
Lecture: Markov Chain Monte Carlo. Scribes: Jay DeYoung, Iris Seaman. Last Lecture: Importance Sampling - PowerPoint PPT Presentation



  1. Lecture: Markov Chain Monte Carlo. Scribes: Jay DeYoung, Iris Seaman

  2. Last Lecture: Importance Sampling. Idea: Generate samples x^s ~ q(x) from a proposal distribution q(x) that is similar to the target. The weight is high when q underrepresents the target and low when q overrepresents it. E[f(x)] = ∫ f(x) π(x) dx ≈ (1/S) Σ_s w^s f(x^s), with x^s ~ q(x) and w^s = γ(x^s) / q(x^s).
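As a concrete illustration of the slide above, here is a minimal self-normalized importance sampler; the target (a standard normal known only up to its normalizer) and the wide Gaussian proposal are illustrative choices, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target: gamma(x) = exp(-x^2/2), i.e. Normal(0, 1) up to Z = sqrt(2*pi).
gamma = lambda x: np.exp(-0.5 * x**2)

# Proposal q(x) = Normal(0, 2^2), deliberately wider than the target.
q = lambda x: np.exp(-0.5 * (x / 2.0)**2) / (2.0 * np.sqrt(2 * np.pi))

S = 100_000
xs = rng.normal(0.0, 2.0, size=S)          # x^s ~ q(x)
w = gamma(xs) / q(xs)                      # importance weights w^s = gamma(x^s)/q(x^s)

est = np.sum(w * xs**2) / np.sum(w)        # self-normalized estimate of E[x^2]
print(est)                                 # close to 1.0, the variance of Normal(0, 1)
```

Self-normalizing (dividing by the sum of weights) is what lets the unnormalized γ stand in for π.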

  3. Likelihood Weighting. Default choice for a Bayes net. Assume γ(x) = p(y, x), Z = p(y). Set the proposal to the prior: q(x) = p(x). Importance weights = likelihood: w^s = p(y | x^s). Estimate: F̄ = Σ_{s=1}^S w^s f(x^s) / Σ_{s'=1}^S w^{s'}.
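A sketch of likelihood weighting on a toy one-variable model (prior x ~ Norm(0, 1), likelihood y | x ~ Norm(x, 1), observed y = 1; my choices, not the lecture's). Sampling from the prior and weighting by the likelihood recovers the conjugate posterior mean y/2.

```python
import numpy as np

rng = np.random.default_rng(1)
y, S = 1.0, 200_000

# Proposal = prior: x^s ~ p(x) = Normal(0, 1).
xs = rng.normal(0.0, 1.0, size=S)

# Importance weight = likelihood: w^s = p(y | x^s) = Normal(y; x^s, 1)
# (the constant 1/sqrt(2*pi) cancels under self-normalization).
w = np.exp(-0.5 * (y - xs)**2)

post_mean = np.sum(w * xs) / np.sum(w)     # ≈ y/2 = 0.5 (exact posterior is Norm(y/2, 1/2))
print(post_mean)
```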

  4. Motivating Problem: Hidden Markov Models. [Figure: latent states z_{t-1} → z_t → z_{t+1}, each with an observation z_t → y_t.] Goal: posterior on the parameters, p(θ | y). Will likelihood weighting, with weight w = p(y | θ), work here?

  5. Markov Chain Monte Carlo. Idea: Use the previous sample x^{s-1} to propose the next sample x^s. A Markov chain: a sequence of random variables X^1, ..., X^S. A (discrete-time) chain is Markov when p(X^s = x^s | X^1 = x^1, ..., X^{s-1} = x^{s-1}) = p(X^s = x^s | X^{s-1} = x^{s-1}). A Markov chain is homogeneous when p(X^s = x^s | X^{s-1} = x^{s-1}) = p(X' = x^s | X = x^{s-1}) for all s.

  6. Markov Chain Monte Carlo: Convergence. A Markov chain converges to a target density π(x) when lim_{s→∞} p(X^s = x) = π(x). [Figure: the "frequency" with which the chain visits X = x approaches π(X = x).]

  7. Markov Chain Monte Carlo: Detailed Balance. A homogeneous Markov chain satisfies detailed balance when π(x) p(x' | x) = π(x') p(x | x'). Implication: detailed balance leaves π invariant, ∫ π(x) p(x' | x) dx = π(x'). If you start with a sample x ~ π and then sample x' ~ p(x' | x), then x' ~ π.
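The detailed-balance condition above can be checked numerically for a small discrete chain. This sketch builds a Metropolis kernel for an arbitrary 3-state target (my example, with a uniform proposal over states) and verifies both detailed balance and invariance of π.

```python
import numpy as np

pi = np.array([0.2, 0.3, 0.5])            # target distribution over 3 states
n = len(pi)

# Metropolis kernel: uniform proposal over states, accept with min(1, pi[j]/pi[i]).
P = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        if i != j:
            P[i, j] = (1.0 / n) * min(1.0, pi[j] / pi[i])
    P[i, i] = 1.0 - P[i].sum()            # rejected mass stays at state i

# Detailed balance: pi[i] * P[i, j] == pi[j] * P[j, i] for all i, j.
flow = pi[:, None] * P
print(np.allclose(flow, flow.T))          # True

# Implication: pi is invariant under P.
print(np.allclose(pi @ P, pi))            # True
```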

  8. Metropolis-Hastings. Idea: Starting from the current sample x^s, generate a proposal x' ~ q(x' | x^s) and accept x^{s+1} = x' with probability α; with probability (1 - α) reject the proposal and retain the previous sample, x^{s+1} = x^s.

  9. Metropolis-Hastings. Idea: Starting from the current sample x^s, generate a proposal x' ~ q(x' | x^s) and accept x^{s+1} = x' with probability α = min(1, [π(x') q(x^s | x')] / [π(x^s) q(x' | x^s)]); with probability (1 - α) reject the proposal and retain the previous sample, x^{s+1} = x^s. Exercise: Show that the Markov chain x^1, x^2, ... satisfies detailed balance.
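Putting slides 8-9 together, a minimal scalar Metropolis-Hastings sampler might look like the following; the unnormalized target γ(x) ∝ Norm(x; 3, 1) and the unit-variance Gaussian proposal are illustrative assumptions. The q terms in the acceptance ratio cancel here (symmetric proposal) but are kept to match the general formula.

```python
import numpy as np

rng = np.random.default_rng(2)

# Unnormalized log target: gamma(x) ∝ Normal(x; 3, 1); no normalizer needed.
log_gamma = lambda x: -0.5 * (x - 3.0)**2

def log_q(x_new, x_old):
    # Gaussian proposal q(x'|x) = Normal(x'; x, 1); symmetric, shown for generality.
    return -0.5 * (x_new - x_old)**2

x = 0.0
samples = []
for s in range(50_000):
    x_prop = rng.normal(x, 1.0)
    log_alpha = (log_gamma(x_prop) + log_q(x, x_prop)
                 - log_gamma(x) - log_q(x_prop, x))
    if np.log(rng.uniform()) < min(0.0, log_alpha):
        x = x_prop                         # accept the proposal
    samples.append(x)                      # on reject, retain the previous sample

print(np.mean(samples[10_000:]))           # ≈ 3.0 after burn-in
```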

  10. Metropolis-Hastings: Detailed Balance. Detailed balance: π(x) p(x' | x) = π(x') p(x | x'). Define the Metropolis-Hastings kernel: k(x' | x) = q(x' | x) min(1, [π(x') q(x | x')] / [π(x) q(x' | x)]). Then π(x) k(x' | x) = min(π(x) q(x' | x), π(x') q(x | x')) = π(x') k(x | x').

  11. Metropolis-Hastings: Unnormalized Densities. Nice property: the acceptance probability can be calculated from the unnormalized densities γ(x) = Z π(x) and γ(x') = Z π(x'): α = min(1, [π(x') q(x | x')] / [π(x) q(x' | x)]) = min(1, [γ(x') q(x | x')] / [γ(x) q(x' | x)]).

  12. Metropolis-Hastings: Choosing Proposals 1. Independent proposals: sample x' from the prior, q(x' | x) = p(x'). Then α = min(1, [γ(x') p(x)] / [γ(x) p(x')]) = min(1, p(y | x') / p(y | x)). Straightforward, but low acceptance probability.

  13. Metropolis-Hastings: Choosing Proposals 2. Continuous variables: Gaussian ("random walk") proposal q(x' | x) = Norm(x'; x, σ²). Trade-off in σ²: small steps vs. far proposals. The proposal is symmetric, q(x' | x) = q(x | x'), so α = min(1, π(x') / π(x)). σ² too small: good acceptance probability, but high correlation between samples. σ² too large: less correlation, but lower acceptance probability. Rule of thumb: tune σ² so that proposals are accepted ≈ 0.234 of the time.
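The σ² trade-off is easy to see empirically. This sketch runs a random-walk sampler on a standard-normal target (my choice) with a too-small, a too-large, and a roughly tuned step size, and reports the acceptance rate of each.

```python
import numpy as np

def rw_mh(sigma, steps=20_000, seed=3):
    """Random-walk Metropolis on a Normal(0, 1) target; returns the acceptance rate."""
    rng = np.random.default_rng(seed)
    log_pi = lambda x: -0.5 * x**2
    x, accepts = 0.0, 0
    for _ in range(steps):
        x_prop = rng.normal(x, sigma)
        # Symmetric proposal: q terms cancel, so alpha = min(1, pi(x')/pi(x)).
        if np.log(rng.uniform()) < log_pi(x_prop) - log_pi(x):
            x, accepts = x_prop, accepts + 1
    return accepts / steps

print(rw_mh(0.1))    # sigma too small: acceptance near 1, but highly correlated samples
print(rw_mh(50.0))   # sigma too large: acceptance near 0
print(rw_mh(2.4))    # a moderate step size lands between the two extremes
```

(The 0.234 figure on the slide is an asymptotic high-dimensional result; in one dimension the optimal acceptance rate is somewhat higher.)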

  14. Gibbs Sampling (Next Lecture). Idea: Propose one variable at a time, holding the other variables constant. For p(x_1, x_2 | y): propose x'_1 ~ p(x_1 | y, x_2), keeping x'_2 = x_2. Acceptance ratio: α = min(1, ...) = 1, so each proposal is accepted with probability 1.
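A preview of Gibbs sampling in code, using a zero-mean bivariate Gaussian with correlation ρ = 0.8 as an illustrative target (my choice); its full conditionals are the textbook x_1 | x_2 ~ Norm(ρ x_2, 1 − ρ²) and symmetrically for x_2.

```python
import numpy as np

rng = np.random.default_rng(4)
rho = 0.8                                  # target: bivariate Normal, unit variances, corr rho

x1, x2 = 0.0, 0.0
xs = np.empty((20_000, 2))
for s in range(len(xs)):
    # Each conditional update is accepted with probability 1; there is no reject step.
    x1 = rng.normal(rho * x2, np.sqrt(1 - rho**2))   # x1 ~ p(x1 | x2)
    x2 = rng.normal(rho * x1, np.sqrt(1 - rho**2))   # x2 ~ p(x2 | x1)
    xs[s] = x1, x2

print(np.corrcoef(xs[5_000:].T)[0, 1])     # ≈ 0.8, the target correlation
```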

  15. Importance Sampling vs. MCMC. γ(θ) = p(y, θ) = p(y | θ) p(θ), Z = p(y), π(θ) = p(θ | y). Importance sampling: θ^s ~ q(θ), w^s = γ(θ^s) / q(θ^s), and E[w^s] = p(y). "Guess and check"; gives an estimate of the marginal likelihood. Metropolis-Hastings: θ' ~ q(θ' | θ^s), u ~ Unif(0, 1), θ^{s+1} = θ' if u < α = min(1, [γ(θ') q(θ^s | θ')] / [γ(θ^s) q(θ' | θ^s)]), else θ^{s+1} = θ^s. Can do "hill climbing", but gives no estimate of the marginal.
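The claim E[w^s] = p(y) can be verified on a conjugate toy model (prior θ ~ Norm(0, 1), likelihood y | θ ~ Norm(θ, 1), y = 1; my choices), where the exact marginal p(y) = Norm(y; 0, 2) is available for comparison.

```python
import numpy as np

rng = np.random.default_rng(5)
y, S = 1.0, 200_000

# gamma(theta) = p(y, theta) = p(y|theta) p(theta); proposal q = prior p(theta),
# so w = gamma/q = p(y|theta), here with its normalizer kept (no self-normalization).
theta = rng.normal(0.0, 1.0, size=S)
w = np.exp(-0.5 * (y - theta)**2) / np.sqrt(2 * np.pi)

est = w.mean()                             # Monte Carlo estimate of E_q[w] = Z = p(y)
exact = np.exp(-y**2 / 4) / np.sqrt(4 * np.pi)   # p(y) = Normal(y; 0, 2)
print(est, exact)
```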

  16. Computing Marginal Likelihoods. Motivation: model comparison. Question: how many clusters K? [Figure: when is log p(y) low, and when is it high?]

  17. Computing Marginal Likelihoods. Motivation: model comparison. Question: how many clusters K? [Figure: when is log p(y) low, and when is it high? Fewer clusters: bad; lots of clusters: bad.]

  18. Computing Marginal Likelihoods. Motivation: model comparison. Question: how many clusters K? [Figure: when is log p(y) low, and when is it high? Fewer clusters: bad; lots of clusters: bad.] Bayesian Approach: compare marginal likelihoods: K* = argmax_{K ∈ {1, ..., K_max}} p(y | K) = argmax_K ∫ p(y | θ, K) p(θ | K) dθ.

  19. Annealed Importance Sampling. Idea 1: Sample from the target by way of intermediate distributions γ_0(θ), γ_1(θ), ..., γ_N(θ). Idea 2: Use MCMC to generate the proposals.

  20. Annealed Importance Sampling. Initialization: θ_1 ~ γ_0(θ), w_1 = 1. Transition: θ_n ~ k_n(θ_n | θ_{n-1}), w_n = w_{n-1} · γ_n(θ_{n-1}) / γ_{n-1}(θ_{n-1}).
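A compact AIS sketch following the initialization/transition recipe above, under illustrative assumptions: a geometric path between a normalized Norm(0, 3²) start γ_0 and the unnormalized target exp(−x²/2), with a few random-walk Metropolis sweeps per temperature. Since γ_0 is normalized, the mean final weight estimates the target's normalizer Z_N = √(2π).

```python
import numpy as np

rng = np.random.default_rng(6)

# gamma_0: normalized Normal(0, 3^2); gamma_N: exp(-x^2/2), whose Z_N = sqrt(2*pi).
log_g0 = lambda x: -0.5 * (x / 3.0)**2 - np.log(3.0 * np.sqrt(2 * np.pi))
log_gN = lambda x: -0.5 * x**2
log_gn = lambda x, b: (1 - b) * log_g0(x) + b * log_gN(x)   # geometric path
betas = np.linspace(0.0, 1.0, 51)

S = 2_000
x = rng.normal(0.0, 3.0, size=S)           # initialization: x ~ gamma_0, w = 1
log_w = np.zeros(S)
for b_prev, b in zip(betas[:-1], betas[1:]):
    # Weight update at the previous sample: w *= gamma_n(x) / gamma_{n-1}(x).
    log_w += log_gn(x, b) - log_gn(x, b_prev)
    # MCMC transition that leaves gamma_b invariant (symmetric random walk MH).
    for _ in range(5):
        x_prop = x + rng.normal(0.0, 1.0, size=S)
        accept = np.log(rng.uniform(size=S)) < log_gn(x_prop, b) - log_gn(x, b)
        x = np.where(accept, x_prop, x)

Z_est = np.exp(log_w).mean()               # estimates Z_N = sqrt(2*pi) ≈ 2.507
print(Z_est)
```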
