Lecture 7: Hamiltonian Monte Carlo - PowerPoint PPT Presentation

SLIDE 1 Lecture 7: Hamiltonian Monte Carlo

Scribes: Ming & Colin
Lectures: No class Monday
Homework 2: Due Friday 9 Feb (start this week!)
SLIDE 2 Summary: Monte Carlo Methods

Importance Sampling: target π(x) = γ(x)/Z; draw x^s ~ q(x) and weight w^s = γ(x^s)/q(x^s):
  F̂ = Σ_s w^s f(x^s) / Σ_s w^s
+ Very simple, very general
+ Gives estimate of the normalizing constant Ẑ = (1/S) Σ_s w^s
- Need a good proposal q(x)
- Does not scale well to high-dimensional problems
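The importance-sampling estimator summarized above can be sketched concretely. This is a minimal hypothetical example (all names my own): an unnormalized Gaussian target with a wider Gaussian proposal, estimating both the normalizing constant and an expectation.

```python
import numpy as np

rng = np.random.default_rng(0)

def gamma(x):
    """Unnormalized target: gamma(x) = exp(-x^2/2), so Z = sqrt(2*pi)."""
    return np.exp(-0.5 * x**2)

def q_pdf(x):
    """Proposal density q = N(0, 2^2), wider than the target."""
    return np.exp(-0.5 * (x / 2.0) ** 2) / (2.0 * np.sqrt(2.0 * np.pi))

n = 100_000
xs = rng.normal(0.0, 2.0, size=n)      # x^s ~ q(x)
w = gamma(xs) / q_pdf(xs)              # importance weights w^s = gamma(x^s)/q(x^s)

Z_hat = w.mean()                       # estimate of Z; should be close to sqrt(2*pi) ~ 2.507
f_hat = np.sum(w * xs**2) / w.sum()    # self-normalized estimate of E[x^2]; should be close to 1
```

The self-normalized form (dividing by the weight sum rather than by n) is what lets us work with an unnormalized γ.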
SLIDE 3 Summary: Monte Carlo Methods

Sequential Monte Carlo: extend particles x_t^s ~ q(x_t | x_{1:t-1}^s) and reweight
  w_t^s ∝ w_{t-1}^s γ_t(x_{1:t}^s) / [ γ_{t-1}(x_{1:t-1}^s) q(x_t^s | x_{1:t-1}^s) ],
resampling particles in proportion to their weights at each step.
+ Generic strategy for high-dimensional proposals, by performing "natural selection" on particles
+ Also gives an estimate of the marginal likelihood Ẑ
- Particles "die out" during resampling (sample degeneracy)
- Not all problems lend themselves to sequential decomposition
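The "natural selection" step mentioned above is resampling. A minimal sketch of multinomial resampling (function and variable names are my own, not the lecture's):

```python
import numpy as np

rng = np.random.default_rng(0)

def resample(particles, log_weights):
    """Multinomial resampling: each particle survives in proportion
    to its normalized importance weight."""
    w = np.exp(log_weights - np.max(log_weights))  # subtract max for numerical stability
    w /= w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]  # low-weight particles tend to "die out" here

# A particle with negligible weight is almost surely replaced:
particles = np.array([0.0, 1.0, 2.0, 3.0])
log_w = np.array([-50.0, 0.0, 0.0, 0.0])
survivors = resample(particles, log_w)
```

This is exactly where sample degeneracy arises: after repeated resampling, many survivors are copies of a few ancestors.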
SLIDE 4 Markov Chain Monte Carlo

Convergence: A Markov chain converges to a target density π(x) when
  lim_{s→∞} p(X_s = x) = π(X = x),
i.e. a state X = x is visited with "frequency" π(X = x).

[Figure: sketch of a Markov chain wandering over the state space]
SLIDE 5 Markov Chain Monte Carlo

Metropolis-Hastings:
  x' ~ q(x | x^{s-1})
  a = min(1, [γ(x') q(x^{s-1} | x')] / [γ(x^{s-1}) q(x' | x^{s-1})])
  u ~ Uniform(0, 1)
  x^s = x' if u ≤ a, else x^s = x^{s-1}

Gibbs Sampling: γ(x) = p(y, x₁, x₂)
  x₁^s ~ p(x₁ | y, x₂^{s-1})
  x₂^s ~ p(x₂ | y, x₁^s)

+ Much less correlation between samples (Gibbs)
+ Really general (MH)
- Deriving conditional updates can be hard / impossible (Gibbs)
- Tune proposal to optimize accept ratio (MH)
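The Metropolis-Hastings update can be sketched for the common random-walk case, where the symmetric proposal makes the q terms cancel in the acceptance ratio. A minimal sketch with a standard-normal target for illustration (names my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def log_gamma(x):
    """Unnormalized log target; a standard normal, purely for illustration."""
    return -0.5 * x**2

def metropolis_hastings(log_gamma, x0, n_steps, step=1.0):
    """Random-walk Metropolis: q is symmetric, so a = min(1, gamma(x')/gamma(x))."""
    xs = np.empty(n_steps)
    x, lg = x0, log_gamma(x0)
    for s in range(n_steps):
        x_prop = x + rng.normal(0.0, step)        # x' ~ q(x | x^{s-1})
        lg_prop = log_gamma(x_prop)
        if np.log(rng.uniform()) < lg_prop - lg:  # u < a, computed in log space
            x, lg = x_prop, lg_prop
        xs[s] = x                                 # on reject, repeat the previous sample
    return xs

samples = metropolis_hastings(log_gamma, 0.0, 50_000)
```

Working in log space avoids overflow for sharply peaked targets; the repeated sample on rejection is what produces the correlation between samples noted above.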
SLIDE 6 Hamiltonian Monte Carlo: Motivation

Intuition: MCMC algorithms combine "hill climbing" with "stochastic exploration".

Idea: When the target is continuous, can we use gradients ∇_x γ(x)?
1. Hill climbing: find the mode of the distribution
2. Exploration: characterize the variance around the mode
SLIDE 7 Hamiltonian Monte Carlo: Motivation

Idea: Think of the density as an "energy landscape" and the sampler as a "marble" moving around it:
  U(x) = -log γ(x)

Proposal mechanism: simulate the trajectory of a "marble" in the energy landscape. Maxima of γ(x) correspond to minima of U(x).
SLIDE 8 Hamiltonian Monte Carlo

Auxiliary variables: define a density on an extended space of "position" x and "momentum" p:
  γ(x, p) = exp[-U(x) - K(p)] = γ(x) exp[-K(p)]
  U(x) = -log γ(x)        (potential energy)
  K(p) = pᵀ M⁻¹ p / 2     (kinetic energy; M is a "mass" matrix)

Target density:
  π(x, p) = γ(x, p) / Z = π(x) N(p; 0, M)
Marginalizing out the momentum recovers the original target:
  ∫ π(x, p) dp = π(x)
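To spell out the marginalization step (a short check using the definitions on this slide, writing Z_p for the Gaussian normalizer of the momentum):

```latex
\begin{aligned}
\int \pi(x, p)\,\mathrm{d}p
  &= \int \frac{\gamma(x)\,\exp[-K(p)]}{Z\,Z_p}\,\mathrm{d}p,
   \qquad K(p) = p^\top M^{-1} p / 2,\quad Z_p = \sqrt{(2\pi)^d \lvert M \rvert} \\
  &= \frac{\gamma(x)}{Z\,Z_p}\int \exp[-K(p)]\,\mathrm{d}p
   = \frac{\gamma(x)}{Z}
   = \pi(x).
\end{aligned}
```

So samples (x, p) from the extended target give samples from π(x) simply by discarding p.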
SLIDE 9 Hamiltonian Monte Carlo: Algorithm

Auxiliary variables: density on the extended space
  γ(x, p) = exp[-U(x) - K(p)] = γ(x) exp[-K(p)]
  U(x) = -log γ(x)
  K(p) = pᵀ M⁻¹ p / 2

Target density factorizes: π(x, p) = π(x) π(p)
Throw away p: if (x^s, p^s) ~ π(x, p), then x^s ~ π(x)
SLIDE 10 Hamiltonian Monte Carlo

Proposal: simulate a trajectory {(x_t, p_t)}_{t=1}^T starting from x₀ = x^{s-1}:
  p₀ ~ π(p)                 (Gibbs sampling step)
  (x', p') := (x_T, p_T)    (MH update)
Accept or reject:
  a = min(1, [γ(x', p') q(x^{s-1}, p₀ | x', p')] / [γ(x^{s-1}, p₀) q(x', p' | x^{s-1}, p₀)])

Energy conservation ensures γ(x', p') stays close to γ(x^{s-1}, p₀), and the trajectory is a reversible proposal.
SLIDE 11 Hamiltonian Dynamics

  γ(x, p) = exp[-U(x) - K(p)] = exp[-H(x, p)]
  H(x, p) := U(x) + K(p)
  a = min(1, exp[-H(x_T, p_T)] / exp[-H(x₀, p₀)])

Conservation of energy: solutions of Hamilton's equations conserve H:
  dx/dt = ∂H/∂p,   dp/dt = -∂H/∂x
  dH/dt = (∂H/∂x)(dx/dt) + (∂H/∂p)(dp/dt) = (∂H/∂x)(∂H/∂p) - (∂H/∂p)(∂H/∂x) = 0
SLIDE 12 Hamiltonian Dynamics

  γ(x, p) = exp[-U(x) - K(p)] = exp[-H(x, p)]
  H(x, p) = U(x) + K(p)
  U(x) = -log γ(x)
  K(p) = pᵀ M⁻¹ p / 2

Trajectory: integrate the coupled ODEs
  Velocity:  dx/dt = ∂H/∂p = M⁻¹ p           (momentum "divided" by mass)
  Momentum:  dp/dt = -∂H/∂x = ∇ₓ log γ(x)    (going "downhill" in U(x) gains momentum)
SLIDE 13 Numerical Integration

Euler (ε = 0.3): numerical instability
  x(t + ε) = x(t) + ε M⁻¹ p(t)
  p(t + ε) = p(t) + ε ∇ₓ log γ(x(t))

Leapfrog (ε = 0.3): more stable than Euler
  p(t + ε/2) = p(t) + (ε/2) ∇ₓ log γ(x(t))
  x(t + ε)   = x(t) + ε M⁻¹ p(t + ε/2)
  p(t + ε)   = p(t + ε/2) + (ε/2) ∇ₓ log γ(x(t + ε))
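The leapfrog updates above translate almost line-for-line into code. A minimal sketch assuming unit mass M = I (function and variable names are my own); the harmonic potential U(x) = x²/2 at the end is a hypothetical check that energy is nearly conserved:

```python
import numpy as np

def leapfrog(grad_U, x, p, eps, T):
    """Integrate Hamiltonian dynamics with the leapfrog scheme (unit mass M = I).
    grad_U returns the gradient of the potential U(x) = -log gamma(x)."""
    x, p = np.array(x, dtype=float), np.array(p, dtype=float)
    p = p - 0.5 * eps * grad_U(x)        # initial half step for momentum
    for _ in range(T - 1):
        x = x + eps * p                  # full step for position
        p = p - eps * grad_U(x)          # full step for momentum
    x = x + eps * p                      # final full step for position
    p = p - 0.5 * eps * grad_U(x)        # final half step for momentum
    return x, p

# Harmonic potential U(x) = x^2/2, so grad_U(x) = x; H should be nearly constant.
grad_U = lambda x: x
x0, p0 = np.array([1.0]), np.array([0.0])
x1, p1 = leapfrog(grad_U, x0, p0, eps=0.1, T=100)
H = lambda x, p: 0.5 * x @ x + 0.5 * p @ p
```

The interleaved half steps are what make the scheme symplectic: the energy error stays bounded instead of drifting, which is why leapfrog is stable where Euler is not.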
SLIDE 14 Hamiltonian Monte Carlo: Algorithm

  p_s ~ N(0, M)
  x', p' = LEAPFROG(∇U, x^{s-1}, p_s, M, ε, T)
  a = min(1, exp[-H(x', p')] / exp[-H(x^{s-1}, p_s)])
  u ~ Uniform(0, 1)
  x^s = x' if u ≤ a, else x^s = x^{s-1}

The Hamiltonian is conserved along exact trajectories, so the acceptance probability depends only on the integration error.
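The steps of SLIDE 14 can be assembled into a complete sampler. This is a minimal sketch, not the lecture's code: it assumes M = I, inlines the leapfrog loop, and uses a standard-normal target for illustration (all names my own):

```python
import numpy as np

rng = np.random.default_rng(0)

def hmc(log_gamma, grad_log_gamma, x0, n_samples, eps=0.1, T=20):
    """Hamiltonian Monte Carlo with identity mass matrix M = I."""
    grad_U = lambda x: -grad_log_gamma(x)          # U(x) = -log gamma(x)
    H = lambda x, p: -log_gamma(x) + 0.5 * p @ p   # H = U + K
    x = np.atleast_1d(np.asarray(x0, dtype=float))
    out = np.empty((n_samples, x.size))
    for s in range(n_samples):
        p = rng.normal(size=x.shape)               # Gibbs step: p ~ N(0, I)
        xn, pn = x.copy(), p.copy()
        pn -= 0.5 * eps * grad_U(xn)               # leapfrog trajectory
        for _ in range(T - 1):
            xn += eps * pn
            pn -= eps * grad_U(xn)
        xn += eps * pn
        pn -= 0.5 * eps * grad_U(xn)
        # MH accept/reject: a = min(1, exp[H(x, p) - H(x', p')])
        if np.log(rng.uniform()) < H(x, p) - H(xn, pn):
            x = xn
        out[s] = x
    return out

# Standard-normal target: log gamma(x) = -x^T x / 2, grad = -x.
samples = hmc(lambda x: -0.5 * x @ x, lambda x: -x, 0.0, 5000)
```

Because the integration error is small, nearly every proposal is accepted while still moving a full trajectory length away, which is the source of HMC's low sample correlation.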
SLIDE 15 Hamiltonian Monte Carlo: Tuning Parameters

  x', p' = LEAPFROG(∇U, x^{s-1}, p_s, M, ε, T)

Tunable parameters: M, ε, T
  M: estimate by running the sampler; M = Σ̂⁻¹ where Σ̂ᵢⱼ := E[xᵢ xⱼ] - E[xᵢ] E[xⱼ]
  ε: tune to achieve a target acceptance rate E[a]
  T: No-U-Turn Sampler (NUTS): stop when the trajectory starts "doubling back"
SLIDE 16 Debugging Monte Carlo Methods: "Getting it Right"

Geweke-style Testing: Assume a Bayes net
  x^n ~ p(x)            (sample from the prior)
  y^n ~ p(y | x^n)      (sample "data" from the likelihood)
  x^n ~ p(x | y^n)      (run inference to sample from the posterior: IS / SMC / MH / Gibbs / HMC)

If the inference code is correct, the two ways of generating (x, y) produce the same joint distribution, so statistics of x computed under both schemes should match.