
Latent Dirichlet Allocation (Lecture 11, Part 2) - PowerPoint PPT Presentation

Latent Dirichlet Allocation, Lecture 11 (Part 2). Scribes: Jordan, Yuan, Andrea. Midterm: this Wednesday. Homework 3: due Sunday. Covers the LDA generative model, collapsed Gibbs sampling, MAP expectation maximization, sequential Monte Carlo, and Hamiltonian Monte Carlo.


  1. Latent Dirichlet Allocation. Lecture 11 (Part 2). Scribes: Jordan, Yuan, Andrea. Midterm: this Wednesday. Homework 3: due Sunday.

  2. Latent Dirichlet Allocation: Generative Model. For each topic k = 1, ..., K: β_k ~ Dirichlet(η). For each document d: θ_d ~ Dirichlet(α). For each word n = 1, ..., N_d: z_dn ~ Discrete(θ_d), and y_dn | z_dn = k ~ Discrete(β_k).
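The generative process above can be simulated directly; a minimal sketch, assuming symmetric Dirichlet priors and a fixed document length (function name and default concentrations are illustrative, not from the lecture):

```python
import numpy as np

def sample_lda_corpus(D, K, V, N, alpha=0.1, eta=0.01, seed=0):
    """Draw a synthetic corpus from the LDA generative model.

    D documents, K topics, V vocabulary words, N words per document.
    alpha, eta are symmetric Dirichlet concentrations (illustrative defaults).
    """
    rng = np.random.default_rng(seed)
    # beta_k ~ Dirichlet(eta): one word distribution per topic
    beta = rng.dirichlet(np.full(V, eta), size=K)      # shape (K, V)
    # theta_d ~ Dirichlet(alpha): one topic distribution per document
    theta = rng.dirichlet(np.full(K, alpha), size=D)   # shape (D, K)
    z = np.empty((D, N), dtype=int)
    y = np.empty((D, N), dtype=int)
    for d in range(D):
        for n in range(N):
            z[d, n] = rng.choice(K, p=theta[d])        # z_dn ~ Discrete(theta_d)
            y[d, n] = rng.choice(V, p=beta[z[d, n]])   # y_dn ~ Discrete(beta_{z_dn})
    return y, z, theta, beta
```

Returning θ and β alongside the data is convenient for checking inference code against known ground truth.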

  3. Algorithm 1: Collapsed Gibbs Sampling. Gibbs updates for the topic assignments {z_dn}: z_dn ~ p(z_dn | z_\dn, y). Implementation requirement: need to compute marginals over θ and β, p(y, z) = ∫∫ p(y, z, θ, β) dθ dβ. Conjugacy: θ_d ~ Dirichlet(α_1, ..., α_K) and β_k ~ Dirichlet(η_1, ..., η_V) are conjugate to the Discrete likelihoods, so these integrals are available in closed form.

  4. Collapsed Gibbs Updates for LDA. Update assignments {z_dn}: p(z_dn = k | z_\dn, y_dn = v, y_\dn) ∝ (N_dk^\dn + α_k) (N_kv^\dn + η_v) / (N_k^\dn + Σ_v η_v). Sufficient statistics: N_kv = Σ_{d,n} I[z_dn = k] I[y_dn = v], N_dk = Σ_n I[z_dn = k], N_k = Σ_v N_kv; the superscript \dn denotes counts computed with z_dn excluded.
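The update above can be sketched as one sweep over all tokens; a minimal version assuming symmetric scalar priors α and η (function and variable names are illustrative):

```python
import numpy as np

def collapsed_gibbs_sweep(y, z, Ndk, Nkv, alpha, eta, rng):
    """One sweep of collapsed Gibbs sampling for LDA.

    y, z   : (D, N) arrays of word ids and topic assignments.
    Ndk    : (D, K) document-topic counts; Nkv: (K, V) topic-word counts.
    alpha, eta : symmetric Dirichlet concentrations (scalars).
    """
    K, V = Nkv.shape
    Nk = Nkv.sum(axis=1)                       # N_k = sum_v N_kv
    for d in range(y.shape[0]):
        for n in range(y.shape[1]):
            v, k_old = y[d, n], z[d, n]
            # remove z_dn from the counts (the "\dn" statistics)
            Ndk[d, k_old] -= 1; Nkv[k_old, v] -= 1; Nk[k_old] -= 1
            # p(z_dn=k | z_\dn, y) ∝ (N_dk + alpha)(N_kv + eta)/(N_k + V*eta)
            p = (Ndk[d] + alpha) * (Nkv[:, v] + eta) / (Nk + V * eta)
            k_new = rng.choice(K, p=p / p.sum())
            z[d, n] = k_new
            Ndk[d, k_new] += 1; Nkv[k_new, v] += 1; Nk[k_new] += 1
    return z, Ndk, Nkv
```

Keeping the count matrices incremental, rather than recomputing them per token, is what makes each sweep O(D·N·K).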

  5. Expectation Maximization (MAP LDA). Objective: θ*, β* = argmax_{θ,β} log p(θ, β | y) = argmax_{θ,β} log p(y, θ, β). Lower bound: L({θ, β}, q) = E_{q(z)}[log (p(y, z, θ, β) / q(z))] ≤ log p(y, θ, β), since log p(y, θ, β) = L({θ, β}, q) + KL(q(z) || p(z | y, θ, β)). E step: set q(z) = p(z | y, θ, β), which makes the KL term zero. M step: θ, β = argmax_{θ,β} L({θ, β}, q).

  6. Expectation Maximization (MAP LDA). M step: θ, β = argmax_{θ,β} E_{q(z;φ)}[log p(y | z, β) + log p(z | θ)] + log p(θ) + log p(β) (the entropy of q does not depend on θ, β). Exponential family representation: p(x | y) = h(x) exp(y^T t(x) − a(y)). E_{q(z;φ)}[log p(y | z, β)] = Σ_{d,n} Σ_{k,v} E_{q(z;φ)}[I[y_dn = v] I[z_dn = k]] log β_kv, and E_{q(z;φ)}[log p(z | θ)] = Σ_{d,n} Σ_k E_{q(z;φ)}[I[z_dn = k]] log θ_dk.

  7. Expectation Maximization (MAP LDA). Expectation step: φ_dnk = E_{q(z;φ)}[I[z_dn = k]] = q(z_dn = k); compute expected values of the sufficient statistics. Maximization step (exploit conjugacy): θ_dk = (α_k − 1 + Σ_n φ_dnk) / Σ_l (α_l − 1 + Σ_n φ_dnl), β_kv = (η_v − 1 + Σ_{d,n} I[y_dn = v] φ_dnk) / Σ_u (η_u − 1 + Σ_{d,n} I[y_dn = u] φ_dnk).
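The E and M steps above can be sketched as a single EM iteration; a minimal version assuming α, η > 1 (scalar, symmetric) so the MAP updates stay nonnegative, with illustrative names:

```python
import numpy as np

def map_em_step(y, theta, beta, alpha, eta):
    """One EM iteration for MAP LDA.

    y     : (D, N) array of word ids (fixed document length for simplicity).
    theta : (D, K) document-topic proportions; beta: (K, V) topic-word dists.
    alpha, eta : symmetric Dirichlet concentrations, assumed > 1.
    """
    D, N = y.shape
    K, V = beta.shape
    # E step: phi_dnk = q(z_dn = k) ∝ theta_dk * beta_{k, y_dn}
    phi = theta[:, None, :] * beta[:, y].transpose(1, 2, 0)  # (D, N, K)
    phi /= phi.sum(axis=2, keepdims=True)
    # M step: theta_dk ∝ alpha - 1 + sum_n phi_dnk
    theta_new = alpha - 1.0 + phi.sum(axis=1)
    theta_new /= theta_new.sum(axis=1, keepdims=True)
    # M step: beta_kv ∝ eta - 1 + sum_{d,n} I[y_dn = v] phi_dnk
    beta_new = np.full((K, V), eta - 1.0)
    for d in range(D):
        for n in range(N):
            beta_new[:, y[d, n]] += phi[d, n]
    beta_new /= beta_new.sum(axis=1, keepdims=True)
    return theta_new, beta_new
```

The E step here uses the closed form q(z_dn = k) ∝ θ_dk β_{k,y_dn}, which follows because the z_dn are conditionally independent given θ and β.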

  8. Other Sampling Algorithms for LDA: Sequential Monte Carlo. Requirement: a sequence of intermediate densities. For a single document, use the marginal over θ_d: p(y_{d,1:n}, z_{d,1:n} | β). Proposal: q(z_{d,n} | z_{d,1:n−1}, y_{d,1:n}) = p(z_{d,n} | z_{d,1:n−1}, y_{d,1:n}, β) (analogous to Gibbs sampling). Example question: compute the importance weights w_n for generations n > 1.

  9. Sequential Monte Carlo: General Formulation. Assume unnormalized densities γ_1(x_1), ..., γ_T(x_{1:T}). First step (importance sampling): x_1^s ~ q(x_1), w_1^s = γ_1(x_1^s) / q(x_1^s). Subsequent steps (propose from previous samples): a^s ~ Discrete(w_{t−1}^{1:S}), x_t^s ~ q(x_t | x_{1:t−1}^{a^s}), x_{1:t}^s := (x_{1:t−1}^{a^s}, x_t^s). Incremental weight: w_t^s = γ_t(x_{1:t}^s) / (γ_{t−1}(x_{1:t−1}^{a^s}) q(x_t^s | x_{1:t−1}^{a^s})).
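The general recursion above can be sketched as a small particle loop with multinomial resampling at every step; the callback interface is an assumption for illustration, not a library API:

```python
import numpy as np

def smc(log_gamma, propose, log_q, T, S, rng):
    """Generic sequential Monte Carlo over unnormalized densities gamma_t.

    log_gamma(t, x)  : log gamma_t(x_{1:t}) for a trajectory x (list of states)
    propose(t, x)    : draw x_t given the ancestor trajectory x_{1:t-1}
    log_q(t, x, xt)  : log q(x_t | x_{1:t-1})
    Returns S particles of length T and their normalized weights.
    """
    # first step: plain importance sampling
    particles = [[propose(1, [])] for _ in range(S)]
    logw = np.array([log_gamma(1, x) - log_q(1, [], x[0]) for x in particles])
    for t in range(2, T + 1):
        # resample ancestors a^s ~ Discrete(w_{t-1})
        w = np.exp(logw - logw.max()); w /= w.sum()
        ancestors = rng.choice(S, size=S, p=w)
        new_particles, new_logw = [], []
        for a in ancestors:
            xa = list(particles[a])
            xt = propose(t, xa)
            # incremental weight: gamma_t / (gamma_{t-1} * q)
            new_logw.append(log_gamma(t, xa + [xt])
                            - log_gamma(t - 1, xa) - log_q(t, xa, xt))
            new_particles.append(xa + [xt])
        particles, logw = new_particles, np.array(new_logw)
    w = np.exp(logw - logw.max()); w /= w.sum()
    return particles, w
```

Working in log space and subtracting the max before exponentiating avoids the underflow that raw weight products suffer for long sequences.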

  10. Other Sampling Algorithms for LDA: Sequential Monte Carlo. Importance weights: w_n^s = p(y_{d,1:n}, z_{d,1:n}^s | β) / (p(y_{d,1:n−1}, z_{d,1:n−1}^{a^s} | β) q(z_{d,n}^s | z_{d,1:n−1}^{a^s}, y_{d,1:n})). With the Gibbs-style proposal q(z_{d,n} | z_{d,1:n−1}, y_{d,1:n}) = p(z_{d,n} | z_{d,1:n−1}, y_{d,1:n}, β), the weight simplifies to the predictive likelihood w_n^s = p(y_{d,n} | z_{d,1:n−1}^{a^s}, y_{d,1:n−1}, β) = Σ_k p(y_{d,n} | z_{d,n} = k, β) p(z_{d,n} = k | z_{d,1:n−1}^{a^s}), where p(y_{d,n} = v | z_{d,n} = k, β) = β_kv.

  11. Other Sampling Algorithms for LDA: Hamiltonian Monte Carlo. Requires the gradient of the log joint: ∇_{θ,β} U(θ, β) = −∇_{θ,β} log p(y, θ, β). Example question: suppose you were to run LDA on all of Wikipedia. Would you recommend using HMC to sample θ, β ~ p(θ, β | y)?

  12. Hamiltonian Monte Carlo: Algorithm (single HMC step). Sample momentum r_t ~ Norm(0, M). Run (x', r') ← LEAPFROG(x_t, r_t, M, ε, T). Accept with probability a = min(1, exp(H(x_t, r_t) − H(x', r'))): draw u ~ Uniform(0, 1) and set x_{t+1} = x' if u < a, else x_{t+1} = x_t. Example question: how many times are we calling ∇_{θ,β} U(θ, β)?
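A single HMC step as described above can be sketched as follows; a minimal version assuming an identity mass matrix M = I (function names and the exact leapfrog arrangement are illustrative):

```python
import numpy as np

def hmc_step(x, log_p, grad_log_p, eps, T, rng):
    """One HMC step targeting p(x), with U(x) = -log p(x) and M = I."""
    r = rng.normal(size=x.shape)                  # r ~ Norm(0, I)
    x_new = x.copy()
    # LEAPFROG: initial half step for momentum, then T position updates
    r_new = r + 0.5 * eps * grad_log_p(x_new)     # gradient call 1
    for i in range(T):
        x_new = x_new + eps * r_new
        g = grad_log_p(x_new)                     # gradient calls 2..T+1
        r_new = r_new + (eps * g if i < T - 1 else 0.5 * eps * g)
    # Metropolis correction on the Hamiltonian H(x, r) = U(x) + 0.5 r^T r
    H_old = -log_p(x) + 0.5 * float(r @ r)
    H_new = -log_p(x_new) + 0.5 * float(r_new @ r_new)
    if rng.uniform() < min(1.0, np.exp(H_old - H_new)):
        return x_new
    return x
```

Counting gradient calls in this sketch: one for the initial half step plus one per leapfrog iteration, i.e. T + 1 evaluations of ∇U per HMC step, which is what makes full-data HMC expensive.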

  13. Other Sampling Algorithms for LDA. Example question: suppose you were to run LDA on all of Wikipedia. Would you recommend using HMC to sample θ, β ~ p(θ, β | y)? Answer: no; computing the gradient for each leapfrog step would require a full pass over the data.

