Lecture 8: Sequential Monte Carlo (cont.), Gibbs Sampling (Scribes: Alesia Chernihova, Ankur Bambharoliya) - PowerPoint PPT Presentation

  1. Lecture 8: Sequential Monte Carlo (cont.), Gibbs Sampling. Scribes: Alesia Chernihova, Ankur Bambharoliya

  2. Motivating Problem: Hidden Markov Models
     Latent states $z_1, \ldots, z_T$, observations $y_1, \ldots, y_T$, and parameters $\theta$.
     Posterior over parameters:
     $$p(\theta \mid y_{1:T}) = \frac{p(\theta, y_{1:T})}{p(y_{1:T})} = \int p(\theta, z_{1:T} \mid y_{1:T}) \, dz_{1:T}$$
     "Guess from the prior, weight by the likelihood": will likelihood weighting work here, using
     $$\theta^i \sim p(\theta), \qquad z^i_{1:T} \sim p(z_{1:T} \mid \theta^i), \qquad w^i = p(y_{1:T} \mid z^i_{1:T}, \theta^i), \qquad i = 1, \ldots, N\,?$$
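
     To make the question concrete, here is a minimal sketch of likelihood weighting for the parameters of a toy HMM; the model (two latent states, a fixed transition matrix, Gaussian emissions scaled by an unknown theta) and every name in it are assumptions of the sketch, not part of the lecture. The effective sample size typically collapses as the sequence grows, which is what motivates the sequential approach on the following slides.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy discrete HMM: 2 latent states, Gaussian emissions with unknown mean scale theta.
T, N = 50, 1000                         # sequence length, number of samples
A = np.array([[0.9, 0.1], [0.2, 0.8]])  # fixed transition matrix (assumed known here)

def sample_prior_theta():
    return rng.normal(0.0, 2.0)         # prior over the emission parameter theta

def sample_states(T):
    z = np.zeros(T, dtype=int)
    for t in range(1, T):
        z[t] = rng.choice(2, p=A[z[t - 1]])
    return z

# Synthetic data from a "true" theta.
true_theta = 1.5
z_true = sample_states(T)
y = true_theta * z_true + rng.normal(0.0, 1.0, size=T)

# Likelihood weighting: draw (theta, z_{1:T}) from the prior, weight by p(y | z, theta).
log_w = np.zeros(N)
thetas = np.zeros(N)
for i in range(N):
    thetas[i] = sample_prior_theta()
    z = sample_states(T)
    log_w[i] = np.sum(-0.5 * (y - thetas[i] * z) ** 2)  # Gaussian log-likelihood (up to a constant)

w = np.exp(log_w - log_w.max())
w /= w.sum()
print("posterior mean estimate:", np.sum(w * thetas))
print("effective sample size:", 1.0 / np.sum(w ** 2))   # typically collapses for long sequences
```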

  3. Sequential Monte Carlo Example:
     Initialize particles at the first time step,
     $$x^i_1 \sim p(x_1), \qquad w^i_1 := p(y_1 \mid x^i_1).$$

  4. Sequential Monte Carlo Example:
     Resample ancestors and propagate,
     $$a^i_1 \sim \mathrm{Disc}(\bar{w}^1_1, \ldots, \bar{w}^K_1), \qquad x^i_2 \sim p(x_2 \mid x^{a^i_1}_1), \qquad w^i_2 = p(y_2 \mid x^i_2).$$

  5. Sequential Monte Carlo Example:
     Repeat at the next step: resample ancestors in proportion to the weights, propagate through the transition model, and reweight by the likelihood,
     $$a^i_2 \sim \mathrm{Disc}(\bar{w}^1_2, \ldots, \bar{w}^K_2), \qquad x^i_3 \sim p(x_3 \mid x^{a^i_2}_2), \qquad w^i_3 = p(y_3 \mid x^i_3),$$
     and so on for each subsequent time step.
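
     The three example slides above trace a bootstrap particle filter by hand. A compact sketch of the same loop on an assumed linear-Gaussian state space model (the model, its parameters, and all variable names are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(1)

# Assumed linear-Gaussian state space model: x_t = 0.9 x_{t-1} + noise, y_t = x_t + noise.
T, K = 100, 500
x_true = np.zeros(T)
for t in range(1, T):
    x_true[t] = 0.9 * x_true[t - 1] + rng.normal(0.0, 1.0)
y = x_true + rng.normal(0.0, 0.5, size=T)

def log_lik(y_t, x):
    return -0.5 * ((y_t - x) / 0.5) ** 2   # Gaussian observation log-density (up to a constant)

# Step 1: initialize particles from the prior and weight by the first observation.
x = rng.normal(0.0, 1.0, size=K)
logw = log_lik(y[0], x)
means = [np.average(x, weights=np.exp(logw - logw.max()))]

for t in range(1, T):
    # Resample ancestors in proportion to the normalized weights.
    w = np.exp(logw - logw.max())
    a = rng.choice(K, size=K, p=w / w.sum())
    # Propagate through the transition model, then reweight by the likelihood.
    x = 0.9 * x[a] + rng.normal(0.0, 1.0, size=K)
    logw = log_lik(y[t], x)
    means.append(np.average(x, weights=np.exp(logw - logw.max())))

print("posterior-mean RMSE:", np.sqrt(np.mean((np.array(means) - x_true) ** 2)))
```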

  6. Sequential Monte Carlo Example:
     The resampling step prunes "bad" particles. As a result, the particle set is degenerate near the beginning of the sequence (many repeated ancestors) and diverse near the end.

  7. Sequential Importance Sampling
     Idea: express the weights as a product over incremental weights,
     $$w_t(x_{1:t}) = \frac{p(y_{1:t}, x_{1:t})}{q(x_{1:t})} = \underbrace{\frac{p(y_{1:t-1}, x_{1:t-1})}{q(x_{1:t-1})}}_{\text{incoming weight } w_{t-1}} \cdot \underbrace{\frac{p(y_{1:t}, x_{1:t})}{p(y_{1:t-1}, x_{1:t-1}) \, q_t(x_t \mid x_{1:t-1})}}_{\text{incremental importance weight } v_t(x_{1:t})},$$
     where the proposal factorizes as $q(x_{1:t}) = q(x_{1:t-1}) \, q_t(x_t \mid x_{1:t-1})$.
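
     In the log domain this recursion is a single addition per step. A minimal sketch, assuming placeholder callables log_joint for log p(y_{1:t}, x_{1:t}) and log_proposal for log q_t(x_t | x_{1:t-1}); both names are hypothetical:

```python
def update_log_weight(log_w_prev, x_hist, x_new, y_hist, y_new,
                      log_joint, log_proposal):
    """Sequential importance sampling: log w_t = log w_{t-1} + log v_t.

    log_joint(y_hist, x_hist) and log_proposal(x_new, x_hist) are placeholder
    callables for log p(y_{1:t}, x_{1:t}) and log q_t(x_t | x_{1:t-1}).
    """
    log_v = (log_joint(y_hist + [y_new], x_hist + [x_new])
             - log_joint(y_hist, x_hist)
             - log_proposal(x_new, x_hist))
    return log_w_prev + log_v
```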

  8. Importance Resampling
     Idea: perform "natural selection" by selecting particles with probability proportional to their weights.
     Given weighted samples $x^k \sim q(x)$ with weights $w^k$, $k = 1, \ldots, K$, sample ancestor indices (a.k.a. "particle selection"),
     $$a^h \sim \mathrm{Disc}(\bar{w}^1, \ldots, \bar{w}^K), \qquad \bar{w}^k = \frac{w^k}{\sum_{l=1}^K w^l},$$
     and set $\tilde{x}^h = x^{a^h}$ with weight $\tilde{w}^h = \frac{1}{K} \sum_{k=1}^K w^k$.
     High-weight samples are selected more often.
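
     A minimal multinomial resampling helper along these lines (a sketch; in practice systematic or stratified resampling is often preferred, but this mirrors what the slide describes):

```python
import numpy as np

def resample(particles, log_weights, rng):
    """Select particles with probability proportional to their weights.

    Returns the resampled particles and the (uniform) weight each one carries,
    which equals the average of the incoming unnormalized weights.
    """
    w = np.exp(log_weights - np.max(log_weights))
    w_bar = w / w.sum()
    K = len(particles)
    ancestors = rng.choice(K, size=K, p=w_bar)   # a^h ~ Disc(w_bar^1, ..., w_bar^K)
    new_weight = np.mean(np.exp(log_weights))    # (1/K) sum_k w^k
    return particles[ancestors], new_weight

# Usage: high-weight particles appear multiple times after resampling.
rng = np.random.default_rng(0)
xs = np.array([-2.0, 0.5, 1.0, 3.0])
logw = np.array([-4.0, -0.5, -0.2, -6.0])
print(resample(xs, logw, rng))
```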

  9. Importance Resampling: Example
     Draw $x^k \sim q(x)$ and compute weights $w^k$ for $k = 1, \ldots, K$, then resample ancestors $a^h \sim \mathrm{Disc}(\bar{w}^1, \ldots, \bar{w}^K)$ and set $\tilde{x}^h = x^{a^h}$. In the numerical example on the slide, the high-weight samples are duplicated several times among the resampled particles, while the low-weight samples are dropped.

  10. Importance Resampling Interpretation: Auxiliary Variable Trick
      Sample a single particle $x^a$ and treat the index $a$ as an auxiliary variable:
      $$q(a, x^{1:K}) = q(a \mid x^{1:K}) \prod_{k=1}^K q(x^k), \qquad q(a = h \mid x^{1:K}) = \bar{w}^h = \frac{w^h}{\sum_k w^k}.$$
      Define the extended target $\tilde{\gamma}(a, x^{1:K}) = \gamma(x^a) \, \frac{1}{K} \prod_{k \neq a} q(x^k)$, which has $\gamma(x^a)$ as its marginal. The importance weight is then
      $$\tilde{w}(a, x^{1:K}) = \frac{\tilde{\gamma}(a, x^{1:K})}{q(a, x^{1:K})} = \frac{\gamma(x^a)}{K \, \bar{w}^a \, q(x^a)} = \frac{w^a}{K \, \bar{w}^a} = \frac{1}{K} \sum_{k=1}^K w^k.$$

  11. Sequential Monte Carlo (General Formulation)
      Assume a sequence of unnormalized targets $\gamma_1(x_1), \ldots, \gamma_T(x_{1:T})$ and proposals $q_1(x_1), q_2(x_2 \mid x_1), \ldots, q_T(x_T \mid x_{1:T-1})$.
      Initialize with importance sampling:
      $$x^k_1 \sim q_1(x_1), \qquad w^k_1 = \frac{\gamma_1(x^k_1)}{q_1(x^k_1)}.$$
      For $t = 2, \ldots, T$, resample, propose, and weight for sequential importance sampling:
      $$a^k_{t-1} \sim \mathrm{Disc}(\bar{w}^1_{t-1}, \ldots, \bar{w}^K_{t-1}), \qquad x^k_t \sim q_t(x_t \mid x^{a^k_{t-1}}_{1:t-1}), \qquad x^k_{1:t} = (x^{a^k_{t-1}}_{1:t-1}, x^k_t),$$
      $$w^k_t = \frac{\gamma_t(x^k_{1:t})}{\gamma_{t-1}(x^{a^k_{t-1}}_{1:t-1}) \, q_t(x^k_t \mid x^{a^k_{t-1}}_{1:t-1})}.$$
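
      A generic sketch of this formulation, parameterized by user-supplied callables for the log targets and the proposals; all of the function and argument names below are assumptions of the sketch:

```python
import numpy as np

def smc(log_gamma, sample_q, log_q, T, K, rng):
    """Generic sequential Monte Carlo sketch.

    log_gamma(t, x_hist)  ~ log gamma_t(x_{1:t}) for a list x_hist of length t
    sample_q(t, x_hist)   ~ draws x_t ~ q_t(. | x_{1:t-1})
    log_q(t, x_t, x_hist) ~ log q_t(x_t | x_{1:t-1})
    Returns K trajectories of length T and their final log weights.
    """
    # t = 1: plain importance sampling.
    trajectories = [[sample_q(1, [])] for _ in range(K)]
    log_w = np.array([log_gamma(1, tr) - log_q(1, tr[0], []) for tr in trajectories])

    for t in range(2, T + 1):
        # Resample ancestors in proportion to the normalized weights.
        w_bar = np.exp(log_w - log_w.max())
        w_bar /= w_bar.sum()
        ancestors = rng.choice(K, size=K, p=w_bar)
        new_trajectories, new_log_w = [], np.zeros(K)
        for k in range(K):
            parent = trajectories[ancestors[k]]
            x_t = sample_q(t, parent)
            child = parent + [x_t]
            # Incremental weight: gamma_t / (gamma_{t-1} * q_t).
            new_log_w[k] = (log_gamma(t, child)
                            - log_gamma(t - 1, parent)
                            - log_q(t, x_t, parent))
            new_trajectories.append(child)
        trajectories, log_w = new_trajectories, new_log_w
    return trajectories, log_w
```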

  12. Gibbs Sampling
      Idea: propose one variable at a time, holding all other variables constant. For a target $p(x_1, x_2 \mid y)$, propose from the full conditional
      $$x_1' \sim p(x_1 \mid y, x_2) = \frac{p(y, x_1, x_2)}{p(y, x_2)}.$$
      Acceptance ratio (the proposal is accepted with probability 1):
      $$\alpha = \min\left(1, \frac{p(y, x_1', x_2) \, p(x_1 \mid y, x_2)}{p(y, x_1, x_2) \, p(x_1' \mid y, x_2)}\right) = \min\left(1, \frac{p(y, x_1', x_2) \, p(y, x_1, x_2)}{p(y, x_1, x_2) \, p(y, x_1', x_2)}\right) = 1.$$
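
      A small illustration of the one-variable-at-a-time idea on an assumed bivariate Gaussian target (not a model from the lecture; the correlation value is arbitrary), where both full conditionals are Gaussians and every proposal is accepted:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed target: (x1, x2) ~ Norm(0, [[1, rho], [rho, 1]]).
rho = 0.8
n_iter = 5000
x1, x2 = 0.0, 0.0
samples = np.zeros((n_iter, 2))

for i in range(n_iter):
    # Full conditional p(x1 | x2) = Norm(rho * x2, 1 - rho^2); accepted with probability 1.
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho ** 2))
    # Full conditional p(x2 | x1) = Norm(rho * x1, 1 - rho^2).
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho ** 2))
    samples[i] = (x1, x2)

print("empirical correlation:", np.corrcoef(samples.T)[0, 1])  # should be close to rho
```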

  13. Gibbs Sampling: Gaussian Mixture Model (Homework 2)
      Generative model / graphical model:
      $$\mu_k, \Sigma_k \sim p(\mu, \Sigma), \qquad k = 1, \ldots, K,$$
      $$z_n \sim \mathrm{Discrete}(\pi), \qquad y_n \mid z_n = k \sim \mathrm{Norm}(\mu_k, \Sigma_k), \qquad n = 1, \ldots, N.$$
      Gibbs sampler updates:
      Local variables: $z_n \sim p(z_n \mid y_n, \mu_{1:K}, \Sigma_{1:K})$, $n = 1, \ldots, N$.
      Global variables: $\mu_k, \Sigma_k \sim p(\mu_k, \Sigma_k \mid \{y_n : z_n = k\})$, $k = 1, \ldots, K$.
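
      A sketch of the two alternating update blocks for a mixture model of this kind, deliberately simplified to one-dimensional data, fixed unit observation variance, fixed uniform mixture weights, and a Norm(0, tau^2) prior on the means so that the global update stays closed form; these simplifications and all names are assumptions of the sketch, not the homework specification:

```python
import numpy as np

rng = np.random.default_rng(0)

K, N, tau2 = 3, 300, 100.0
pi = np.ones(K) / K
true_mu = np.array([-4.0, 0.0, 4.0])
y = true_mu[rng.choice(K, size=N)] + rng.normal(size=N)

mu = rng.normal(0.0, 1.0, size=K)      # initialize global variables
for sweep in range(200):
    # Local update: z_n ~ p(z_n | y_n, mu) for each n (constants cancel in the normalization).
    logp = np.log(pi) - 0.5 * (y[:, None] - mu[None, :]) ** 2     # shape (N, K)
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=p[n]) for n in range(N)])

    # Global update: mu_k ~ p(mu_k | {y_n : z_n = k}) via Gaussian-Gaussian conjugacy.
    for k in range(K):
        yk = y[z == k]
        prec = 1.0 / tau2 + len(yk)            # posterior precision
        mean = yk.sum() / prec                 # posterior mean (prior mean 0)
        mu[k] = rng.normal(mean, np.sqrt(1.0 / prec))

print("sampled means:", np.sort(mu))
```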

  14. Gibbs Sampling: Conditional Independence
      Local variables: the joint factorizes over data points,
      $$p(y_{1:N}, z_{1:N} \mid \mu, \Sigma) = \prod_{n=1}^N p(y_n, z_n \mid \mu, \Sigma), \qquad p(y_n, z_n = k \mid \mu, \Sigma) = p(y_n \mid z_n = k, \mu, \Sigma) \, p(z_n = k),$$
      $$p(z_n = k \mid y_n, \mu, \Sigma) = \frac{p(y_n, z_n = k \mid \mu, \Sigma)}{\sum_{l=1}^K p(y_n, z_n = l \mid \mu, \Sigma)}.$$
      Can compute updates for all local variables $z_n$ in $O(NK)$.
      Global variables $\mu_k, \Sigma_k$: the likelihood groups by cluster assignment,
      $$p(y_{1:N} \mid \mu, \Sigma, z_{1:N}) = \prod_{n=1}^N p(y_n \mid \mu_{z_n}, \Sigma_{z_n}) = \prod_{k=1}^K \prod_{n : z_n = k} \mathrm{Norm}(y_n \mid \mu_k, \Sigma_k).$$

  15. Gibbs Sampling: Global Update
      Idea: ensure that the likelihood is conjugate to the prior.
      $$\underbrace{p(\mu_k, \Sigma_k \mid \{y_n : z_n = k\})}_{\text{cluster posterior}} = \frac{\overbrace{\prod_{n : z_n = k} p(y_n \mid z_n = k, \mu_k, \Sigma_k)}^{\text{likelihood}} \;\; \overbrace{p(\mu_k, \Sigma_k)}^{\text{conjugate prior}}}{\underbrace{p(\{y_n : z_n = k\})}_{\text{marginal likelihood}}}$$

  16. Exponential Families
      An exponential family distribution has the form
      $$p(x \mid \eta) = h(x) \exp\left[\eta^\top t(x) - a(\eta)\right],$$
      where $h(x)$ is the base measure (counting, Lebesgue; depends only on $x$), $t(x)$ are the sufficient statistics (depend only on $x$), $\eta$ are the natural parameters (so $\eta^\top t(x)$ depends on both $x$ and $\eta$), and $a(\eta)$ is the log normalizer (depends only on $\eta$).

  17. Example: Univariate Gaussian
      $$p(x \mid \eta) = h(x) \exp\left[\eta^\top t(x) - a(\eta)\right]$$
      $$\mathrm{Norm}(x \mid \mu, \sigma^2) = (2\pi\sigma^2)^{-1/2} \exp\left[-\frac{(x - \mu)^2}{2\sigma^2}\right] = (2\pi\sigma^2)^{-1/2} \exp\left[-\left(x^2 - 2x\mu + \mu^2\right)/2\sigma^2\right]$$
      $$t(x) = (x, x^2), \qquad \eta = \left(\frac{\mu}{\sigma^2}, -\frac{1}{2\sigma^2}\right), \qquad a(\eta) = \frac{\mu^2}{2\sigma^2} + \frac{1}{2}\log 2\pi\sigma^2, \qquad h(x) = 1.$$
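
      A quick numerical check that this natural parameterization reproduces the usual Gaussian density (a sketch; the helper names are ad hoc):

```python
import numpy as np

def gauss_pdf(x, mu, sigma2):
    return (2 * np.pi * sigma2) ** -0.5 * np.exp(-(x - mu) ** 2 / (2 * sigma2))

def expfam_pdf(x, mu, sigma2):
    eta = np.array([mu / sigma2, -1.0 / (2 * sigma2)])               # natural parameters
    t = np.array([x, x ** 2])                                        # sufficient statistics
    a = mu ** 2 / (2 * sigma2) + 0.5 * np.log(2 * np.pi * sigma2)    # log normalizer
    return np.exp(eta @ t - a)                                       # h(x) = 1

x, mu, sigma2 = 1.3, 0.4, 2.5
print(gauss_pdf(x, mu, sigma2), expfam_pdf(x, mu, sigma2))   # identical values
```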

  18. Properties of Exponential Families
      $$p(x \mid \eta) = h(x) \exp\left[\eta^\top t(x) - a(\eta)\right]$$
      Derivatives of the log normalizer are moments of the sufficient statistics. Start from normalization,
      $$1 = \int dx \, p(x \mid \eta) = \int dx \, h(x) \exp\left[\eta^\top t(x) - a(\eta)\right] \;\Rightarrow\; \exp\left[a(\eta)\right] = \int dx \, h(x) \exp\left[\eta^\top t(x)\right].$$
      Differentiating with respect to $\eta$,
      $$\nabla_\eta a(\eta) = \frac{\int dx \, t(x) \, h(x) \exp\left[\eta^\top t(x)\right]}{\int dx \, h(x) \exp\left[\eta^\top t(x)\right]} = \int dx \, t(x) \, h(x) \exp\left[\eta^\top t(x) - a(\eta)\right] = \mathbb{E}_{p(x \mid \eta)}\left[t(x)\right].$$
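
      Continuing the univariate Gaussian example, a finite-difference check of this identity against the known moments E[x] = mu and E[x^2] = sigma^2 + mu^2; the closed form of a(eta) in natural parameters below is obtained by substituting sigma^2 = -1/(2 eta_2) and mu = -eta_1/(2 eta_2) into the previous slide's expression:

```python
import numpy as np

def log_normalizer(eta):
    eta1, eta2 = eta                       # requires eta2 < 0
    return -eta1 ** 2 / (4 * eta2) + 0.5 * np.log(-np.pi / eta2)

mu, sigma2 = 0.7, 1.8
eta = np.array([mu / sigma2, -1.0 / (2 * sigma2)])

# Finite-difference gradient of a(eta).
eps = 1e-6
grad = np.array([
    (log_normalizer(eta + eps * np.eye(2)[d]) - log_normalizer(eta - eps * np.eye(2)[d]))
    / (2 * eps)
    for d in range(2)
])

moments = np.array([mu, sigma2 + mu ** 2])   # E[t(x)] = (E[x], E[x^2])
print(grad, moments)                          # the two should agree
```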

  19. Properties of Exponential Families
      $$p(x \mid \eta) = h(x) \exp\left[\eta^\top t(x) - a(\eta)\right]$$
      Moments of $t(x)$ are computable from derivatives of $a(\eta)$:
      $$\nabla_\eta a(\eta) = \mathbb{E}_{p(x \mid \eta)}\left[t(x)\right].$$
      When the statistics $t_d(x)$ are linearly independent, the exponential family is known as minimal. For any minimal family, $a(\eta)$ is convex and the mean mapping
      $$\eta \mapsto \mu := \mathbb{E}_{p(x \mid \eta)}\left[t(x)\right]$$
      is one-to-one (1:1).

  20. Conjugate Priors
      Likelihood:
      $$p(x \mid \eta) = h(x) \exp\left[\eta^\top t(x) - a(\eta)\right]$$
      Conjugate prior, with hyperparameters $\lambda = (\lambda_1, \lambda_2)$ and statistics $t(\eta) = (\eta, -a(\eta))$:
      $$p(\eta \mid \lambda) := h(\eta) \exp\left[\lambda^\top t(\eta) - a(\lambda)\right] = h(\eta) \exp\left[\lambda_1^\top \eta - \lambda_2 \, a(\eta) - a(\lambda)\right]$$
      Joint:
      $$p(x, \eta \mid \lambda) = p(x \mid \eta) \, p(\eta \mid \lambda) = h(x) \, h(\eta) \exp\left[(\lambda_1 + t(x))^\top \eta - (\lambda_2 + 1) \, a(\eta) - a(\lambda)\right]$$

  21. Conjugate Priors: Joint
      $$p(x, \eta \mid \lambda) = p(x \mid \eta) \, p(\eta \mid \lambda) = h(x) \, h(\eta) \exp\left[(\lambda_1 + t(x))^\top \eta - (\lambda_2 + 1) \, a(\eta) - a(\lambda)\right]$$
      Marginal:
      $$p(x \mid \lambda) = \int d\eta \; p(x, \eta \mid \lambda) = h(x) \exp\left[a(\lambda + (t(x), 1)) - a(\lambda)\right]$$
      Posterior:
      $$p(\eta \mid x, \lambda) = \frac{p(x, \eta \mid \lambda)}{p(x \mid \lambda)} = p(\eta \mid \lambda + (t(x), 1))$$
      Conjugacy: the posterior is in the same family as the prior. Can compute the log normalizer of the posterior from the marginal.
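
      A small sketch of the lambda -> lambda + (t(x), 1) update for a Bernoulli likelihood, whose conjugate prior in this form corresponds to a Beta distribution; the mapping lambda_1 = alpha, lambda_2 = alpha + beta used below is an assumption spelled out for the sketch:

```python
# Bernoulli as an exponential family: t(x) = x, a(eta) = log(1 + exp(eta)), h(x) = 1.
# The conjugate prior p(eta | lam) with lam = (lam1, lam2) corresponds to Beta(alpha, beta)
# on theta = sigmoid(eta), with lam1 = alpha and lam2 = alpha + beta.

def update(lam, x):
    """Posterior hyperparameters: lam -> lam + (t(x), 1)."""
    lam1, lam2 = lam
    return (lam1 + x, lam2 + 1)

alpha, beta = 2.0, 3.0
lam = (alpha, alpha + beta)
data = [1, 0, 1, 1, 0, 1]
for x in data:
    lam = update(lam, x)

alpha_post = lam[0]
beta_post = lam[1] - lam[0]
print(alpha_post, beta_post)   # Beta(2 + #heads, 3 + #tails) = Beta(6.0, 5.0)
```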

  22. Gibbs Sampling: Homework
      Idea: ensure that the likelihood is conjugate to the prior.
      $$\underbrace{p(\mu_k, \Sigma_k \mid \{y_n : z_n = k\})}_{\text{cluster posterior}} = \frac{\overbrace{\prod_{n : z_n = k} p(y_n \mid z_n = k, \mu_k, \Sigma_k)}^{\text{likelihood}} \;\; \overbrace{p(\mu_k, \Sigma_k)}^{\text{conjugate prior}}}{\underbrace{p(\{y_n : z_n = k\})}_{\text{marginal likelihood}}}$$
      Derive this posterior $p(\mu_k, \Sigma_k \mid \{y_n : z_n = k\})$ in the homework.
