Lecture 16: Variational Autoencoders
Scribes: Ming-Ming, Colin

Motivation: Inferring Latent Variables from Images

Dataset: MNIST; 60k images of handwritten digits.
Goal: infer two latent variables from each image x_n:
  1. the digit label y_n ∈ {0, ..., 9}
  2. a latent code z_n capturing the handwriting style

Deep Generative Models

Idea 1: use a neural network f to define a generative model. For each image n:

  y_n ~ Discrete(1/10, ..., 1/10)    (all ten digit labels equally probable a priori)
  z_n ~ Normal(0, I)
  x_n ~ Bernoulli(f(y_n, z_n; θ))
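As a sanity check, the generative process in Idea 1 can be simulated by ancestral sampling. The sketch below uses a hypothetical stand-in for the network f (a random affine map followed by a sigmoid), since the lecture does not fix an architecture here:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small dimensions for illustration; MNIST would use P = 784.
K, P = 2, 16          # latent dimension, pixel dimension

# Stand-in for the neural network f(y, z; theta): a random affine map
# followed by a sigmoid, so outputs lie in (0, 1) as Bernoulli means require.
W = rng.normal(size=(P, 10 + K))
b = rng.normal(size=P)

def f(y, z):
    y_onehot = np.eye(10)[y]
    logits = W @ np.concatenate([y_onehot, z]) + b
    return 1.0 / (1.0 + np.exp(-logits))   # Bernoulli means in (0, 1)

# Ancestral sampling, following the generative process above.
y = rng.integers(0, 10)                    # y ~ Discrete(1/10, ..., 1/10)
z = rng.normal(size=K)                     # z ~ Normal(0, I)
x = rng.binomial(1, f(y, z))               # x ~ Bernoulli(f(y, z; theta))

print(y, x.shape)
```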
Training Deep Generative Models

Idea 2: use stochastic gradient ascent to perform maximum likelihood estimation of the parameters θ of p(x, y, z; θ):

  ℓ(θ) = (1/N) Σ_n log p(x_n; θ) = (1/N) Σ_n log Σ_y ∫ p(x_n, y, z; θ) dz

Evaluating ℓ(θ) requires summing over the labels y and integrating over the latent codes z, which is intractable when f is a neural network.
Training Deep Generative Models

Idea 3: use stochastic gradient ascent to perform variational inference, maximizing the evidence lower bound (ELBO)

  L(θ, φ) = E_{q(y,z;φ)}[ log ( p(x, y, z; θ) / q(y, z; φ) ) ] ≤ log p(x; θ)

Combining Ideas 2 + 3: perform gradient ascent on L(θ, φ). Maximizing the bound over φ is equivalent to minimizing the KL divergence to the true posterior:

  φ* = argmin_φ KL( q(y, z; φ) || p(y, z | x; θ) )
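The bound and the KL gap can be verified numerically on a toy model with a single discrete latent, where every quantity is exact. The joint probabilities below are hypothetical numbers chosen purely for illustration:

```python
import numpy as np

# Toy model: one observation x and a discrete latent y in {0, 1, 2}.
p_xy = np.array([0.10, 0.25, 0.05])       # joint p(x, y) for each y
p_x = p_xy.sum()                          # marginal p(x)
post = p_xy / p_x                         # true posterior p(y | x)

q = np.array([0.5, 0.3, 0.2])             # an arbitrary variational distribution

elbo = np.sum(q * np.log(p_xy / q))       # E_q[ log p(x, y) / q(y) ]
kl = np.sum(q * np.log(q / post))         # KL( q || p(y | x) )

print(elbo, np.log(p_x), kl)
# The identity log p(x) = ELBO + KL(q || posterior) holds, and KL >= 0,
# so the ELBO is a lower bound, tight exactly when q equals the posterior.
```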
Training Deep Generative Models

Idea 4: use a neural network to define the inference model (a.k.a. the variational distribution). When the digit labels y_n ∈ {0, ..., 9} are observed (supervision), the inference model for an image x_n ∈ R^P is

  y_n ~ Discrete( f^y(x_n; φ) )
  z_n ~ Normal( μ(x_n, y_n; φ), σ²(x_n, y_n; φ) )

Without supervision, the variational distribution factorizes as

  q(y, z | x) = Π_{n=1}^N q_φ(y_n, z_n | x_n)

Variational Autoencoders

Objective: learn a deep generative model and a corresponding inference model by maximizing

  L(θ, φ) = Σ_{n=1}^N E_{q_φ(z_n | x_n)}[ log ( p(x_n, z_n; θ) / q_φ(z_n | x_n) ) ]
Encoder (inference model q_φ(z_n | x_n)): mapping from an image x to a latent code z:

  h_n = σ(W^h x_n + b^h),   h_n ∈ R^H
  μ_n = W^μ h_n + b^μ
  σ_n² = exp(W^σ h_n + b^σ)
  z_n ~ Normal(μ_n, σ_n² I)

Here σ(·) denotes the activation function (e.g. the sigmoid).

Decoder (generative model p(x_n, z_n; θ)): mapping from a latent code z back to an image:

  z_n ~ Normal(0, I)
  h_n = σ(W^h z_n + b^h)
  μ^x_n = σ(W^x h_n + b^x)
  x_n ~ Bernoulli(μ^x_n)

The Bernoulli likelihood yields the binary cross-entropy reconstruction term

  Σ_p [ x_{n,p} log μ^x_{n,p} + (1 − x_{n,p}) log(1 − μ^x_{n,p}) ],

whose negative is the quantity to minimize.
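A minimal numpy sketch of one encoder/decoder forward pass, assuming hypothetical small dimensions and random parameter initializations (the weight names mirror the equations above):

```python
import numpy as np

rng = np.random.default_rng(2)
P, H, K = 8, 5, 2     # pixel, hidden, latent dims (hypothetical small sizes)

sigmoid = lambda a: 1.0 / (1.0 + np.exp(-a))

# Encoder parameters (phi) and decoder parameters (theta), random initializations.
Wh, bh = rng.normal(size=(H, P)), np.zeros(H)
Wm, bm = rng.normal(size=(K, H)), np.zeros(K)
Ws, bs = rng.normal(size=(K, H)), np.zeros(K)
Wh2, bh2 = rng.normal(size=(H, K)), np.zeros(H)
Wx, bx = rng.normal(size=(P, H)), np.zeros(P)

def encode(x):
    h = sigmoid(Wh @ x + bh)
    mu = Wm @ h + bm
    var = np.exp(Ws @ h + bs)              # sigma^2 = exp(...) keeps variances positive
    return mu, var

def decode(z):
    h = sigmoid(Wh2 @ z + bh2)
    return sigmoid(Wx @ h + bx)            # Bernoulli means mu^x in (0, 1)

x = rng.binomial(1, 0.5, size=P).astype(float)
mu, var = encode(x)
z = mu + np.sqrt(var) * rng.normal(size=K) # z ~ Normal(mu, sigma^2 I)
mu_x = decode(z)

# Reconstruction term of the ELBO (negated binary cross-entropy).
recon = np.sum(x * np.log(mu_x) + (1 - x) * np.log(1 - mu_x))
print(mu_x.shape, recon)
```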
Objective:

  L(θ, φ) = E_{q_φ(z | x)}[ log ( p(x, z; θ) / q_φ(z | x) ) ]
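In a conjugate Gaussian toy case this objective can be checked exactly: when q equals the true posterior the bound is tight, and for any other q it falls strictly below log p(x). All numbers below are illustrative, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(5)

# Conjugate toy case where everything is exact: p(z) = N(0, 1),
# p(x | z) = N(z, 1), so p(x) = N(x; 0, 2) and the posterior is N(x/2, 1/2).
x = 0.7

def log_n(v, m, var):
    return -0.5 * (np.log(2 * np.pi * var) + (v - m) ** 2 / var)

# Case 1: q equals the true posterior, so the bound is tight.
mq, vq = x / 2, 0.5
z = rng.normal(mq, np.sqrt(vq), size=100000)
elbo = np.mean(log_n(z, 0, 1) + log_n(x, z, 1) - log_n(z, mq, vq))

# Case 2: a mismatched q(z) = N(0, 1) gives a strictly smaller value.
z2 = rng.normal(0.0, 1.0, size=100000)
elbo2 = np.mean(log_n(z2, 0, 1) + log_n(x, z2, 1) - log_n(z2, 0.0, 1.0))

print(elbo, elbo2, log_n(x, 0, 2))
```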
Training: The Reparameterization Trick

Goal: compute the gradient

  ∇_φ L(θ, φ) = ∇_φ E_{q_φ(z | x)}[ log ( p(x, z; θ) / q_φ(z | x) ) ]

A naive (score-function) estimator rewrites this as

  ∇_φ L(θ, φ) = E_{q_φ(z | x)}[ log ( p(x, z; θ) / q_φ(z | x) ) ∇_φ log q_φ(z | x) ]

and approximates the expectation with samples z^(b) ~ q_φ(z | x).

Problem: the variance of this estimator might be high.
Idea: sample z^(b) using a reparameterized distribution:

  ε^(b) ~ Normal(0, I)
  z^(b) = μ(x^(b); φ) + σ(x^(b); φ) ⊙ ε^(b),   so that z^(b) ~ Normal(μ^(b), σ^(b)² I)

Result: the reparameterized estimator

  ∇_{θ,φ} L(θ, φ) ≈ (1/B) Σ_{b=1}^B ∇_{θ,φ} log [ p(x^(b), z(x^(b), ε^(b); φ); θ) / q_φ(z(x^(b), ε^(b); φ) | x^(b)) ]
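The variance reduction can be seen on a one-dimensional example: estimating the gradient of E_{z ~ Normal(μ, 1)}[z²] with respect to μ (true value 2μ) with both estimators. This toy setup is an illustration, not from the lecture:

```python
import numpy as np

rng = np.random.default_rng(3)

# Target: d/dmu of E_{z ~ Normal(mu, 1)}[ z^2 ], which equals 2*mu exactly.
mu, S = 1.5, 5000
eps = rng.normal(size=S)
z = mu + eps                               # reparameterized samples z ~ Normal(mu, 1)

score_fn = (z ** 2) * (z - mu)             # f(z) * d/dmu log q(z), with sigma = 1
reparam = 2.0 * (mu + eps)                 # d/dmu f(mu + eps)

print(score_fn.mean(), reparam.mean(), 2 * mu)   # both estimate 2*mu = 3
print(score_fn.var(), reparam.var())             # score-function variance is far larger
```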
Practical Implementations

Implement in TensorFlow / PyTorch.

Two variants of the model:
  Continuous only: p(x_n, z_n), q(z_n | x_n)                  (z encodes both style & digit)
  Continuous + discrete: p(x_n, y_n, z_n), q(y_n, z_n | x_n)  (y encodes the digit, z encodes the style)
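A minimal numpy sketch of the one-sample loss that a TensorFlow/PyTorch implementation would differentiate for the continuous-only variant; the parameters here are hypothetical random initializations rather than trained values, and a framework's autodiff would supply ∇_{θ,φ} of the final scalar:

```python
import numpy as np

rng = np.random.default_rng(4)

# One-sample ELBO estimate for p(x_n, z_n), q(z_n | x_n), written the way it
# would appear inside a TensorFlow/PyTorch loss function.
P, K = 6, 2
x = rng.binomial(1, 0.5, size=P).astype(float)

mu, log_sig = rng.normal(size=K), rng.normal(size=K) * 0.1   # encoder outputs for x
Wx, bx = rng.normal(size=(P, K)), np.zeros(P)                # toy linear decoder

def log_normal(z, m, s):                   # log Normal(z; m, s^2 I)
    return np.sum(-0.5 * np.log(2 * np.pi) - np.log(s) - 0.5 * ((z - m) / s) ** 2)

eps = rng.normal(size=K)
z = mu + np.exp(log_sig) * eps             # reparameterized sample z ~ q(z | x)

mu_x = 1.0 / (1.0 + np.exp(-(Wx @ z + bx)))
log_p = (np.sum(x * np.log(mu_x) + (1 - x) * np.log(1 - mu_x))   # log p(x | z)
         + log_normal(z, np.zeros(K), np.ones(K)))               # + log p(z)
log_q = log_normal(z, mu, np.exp(log_sig))                       # log q(z | x)

elbo_hat = log_p - log_q                   # single-sample estimate of L(theta, phi)
print(elbo_hat)
```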