Lecture 8: Sequential Monte Carlo (cont.), Gibbs Sampling. Scribes: Alesia Chernihova, Ankur Bambharoliya (PowerPoint PPT Presentation)

slide-1
SLIDE 1 Lecture 8: Sequential Monte Carlo (cont.), Gibbs Sampling. Scribes: Alesia Chernihova, Ankur Bambharoliya
slide-2
SLIDE 2 Motivating Problem: Hidden Markov Models

A hidden Markov model has latent states $z_1, \dots, z_t$, observations $y_1, \dots, y_t$, and parameters $\theta$. We want the posterior over the parameters,

$$p(\theta \mid y_{1:t}) = \int dz_{1:t}\; p(\theta, z_{1:t} \mid y_{1:t}).$$

Will likelihood weighting work? "Guess" from the prior, then "check" using the likelihood:

$$\theta^s \sim p(\theta), \qquad z_{1:t}^s \sim p(z_{1:t} \mid \theta^s), \qquad w^s = p(y_{1:t} \mid z_{1:t}^s, \theta^s).$$
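A minimal sketch of why this guess-and-check strategy struggles: sampling whole trajectories from the prior and weighting by the likelihood concentrates almost all weight on a few samples. The model below (a Gaussian random walk with Gaussian emissions) is a hypothetical stand-in, not the lecture's exact HMM, and the effective-sample-size diagnostic is an addition for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy model: z_t = z_{t-1} + Normal(0, 1), y_t ~ Normal(z_t, 0.5).
T, S = 20, 1000
y = np.cumsum(rng.normal(size=T))  # synthetic observations

def sample_prior_trajectory():
    # z_{1:T} ~ p(z_{1:T}): a Gaussian random walk
    return np.cumsum(rng.normal(size=T))

def log_likelihood(z):
    # log p(y_{1:T} | z_{1:T}) under y_t ~ Normal(z_t, 0.5)
    return -0.5 * np.sum(((y - z) / 0.5) ** 2)

logw = np.array([log_likelihood(sample_prior_trajectory()) for _ in range(S)])
w = np.exp(logw - logw.max())  # subtract max for numerical stability
w_norm = w / w.sum()

# Effective sample size: near S means healthy weights, near 1 means a
# handful of trajectories carry all the weight (degeneracy).
ess = 1.0 / np.sum(w_norm ** 2)
print(f"ESS = {ess:.1f} out of {S}")
```

With even a short sequence the ESS collapses far below S, which motivates the sequential construction on the next slides.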
slide-3
SLIDE 3 Sequential Monte Carlo: Example

Initialize particles $(w_1^i, x_1^i)$ by sampling from the prior and weighting by the likelihood:

$$x_1^i \sim p(x_1), \qquad w_1^i := p(y_1 \mid x_1^i).$$
slide-4
SLIDE 4 Sequential Monte Carlo: Example

Resample ancestors, propagate, and reweight:

$$a^i \sim \text{Disc}(\bar w_1^1, \dots, \bar w_1^S), \qquad x_2^i \sim p(x_2 \mid x_1^{a^i}), \qquad w_2^i = p(y_2 \mid x_2^i).$$
slide-5
SLIDE 5 Sequential Monte Carlo: Example

The same resample-propagate-reweight step is repeated at the next time point: given particles $(w_2^i, x_2^i)$,

$$a^i \sim \text{Disc}(\bar w_2^1, \dots, \bar w_2^S), \qquad x_3^i \sim p(x_3 \mid x_2^{a^i}), \qquad w_3^i = p(y_3 \mid x_3^i).$$
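The steps on slides 3-5 can be sketched as a bootstrap particle filter. The Gaussian random-walk model below is a hypothetical choice for illustration; the slides' example model is not specified in full.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical model: x_t ~ Normal(x_{t-1}, 1), y_t ~ Normal(x_t, 1).
T, S = 50, 500
x_true = np.cumsum(rng.normal(size=T))
y = x_true + rng.normal(size=T)

def norm_logpdf(v, mean, std):
    return -0.5 * ((v - mean) / std) ** 2 - np.log(std * np.sqrt(2 * np.pi))

# t = 1: sample from the prior, weight by the likelihood (slide 3)
x = rng.normal(size=S)            # x_1^i ~ p(x_1) = Normal(0, 1)
logw = norm_logpdf(y[0], x, 1.0)  # w_1^i = p(y_1 | x_1^i)

for t in range(1, T):
    # resample ancestors a^i ~ Disc(w-bar), propagate, reweight (slides 4-5)
    w = np.exp(logw - logw.max())
    a = rng.choice(S, size=S, p=w / w.sum())
    x = x[a] + rng.normal(size=S)     # x_t^i ~ p(x_t | x_{t-1}^{a^i})
    logw = norm_logpdf(y[t], x, 1.0)  # w_t^i = p(y_t | x_t^i)

posterior_mean = np.average(x, weights=np.exp(logw - logw.max()))
print(f"filtered mean at t={T}: {posterior_mean:.2f}, truth: {x_true[-1]:.2f}")
```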
slide-6
SLIDE 6 Sequential Monte Carlo: Example

The particle set is degenerate near the beginning of the sequence (repeated resampling leaves few distinct ancestors) but diverse near the end. The sampling step, repeated at each time point, "prunes" bad particles.
slide-7
SLIDE 7 Sequential Importance Sampling

Idea: Express the importance weight as a product over incremental weights,

$$w_t = \frac{\gamma_t(x_{1:t})}{q(x_{1:t})} = \underbrace{\frac{\gamma_t(x_{1:t})}{\gamma_{t-1}(x_{1:t-1})\, q(x_t \mid x_{1:t-1})}}_{\text{incremental weight } v_t} \cdot \underbrace{\frac{\gamma_{t-1}(x_{1:t-1})}{q(x_{1:t-1})}}_{\text{incoming weight } w_{t-1}} = v_t\, w_{t-1}.$$

For the HMM, the target factorizes as

$$\gamma_t(x_{1:t}) = p(y_{1:t}, x_{1:t}) = p(y_t, x_t \mid x_{1:t-1}, y_{1:t-1})\, p(y_{1:t-1}, x_{1:t-1}).$$
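The incremental-weight identity can be checked numerically. This is a minimal sketch assuming a hypothetical Gaussian random-walk model with the transition prior as proposal, in which case each incremental weight reduces to the emission likelihood:

```python
import numpy as np

rng = np.random.default_rng(2)

T = 10
y = rng.normal(size=T)
x = rng.normal(size=T)  # one fixed trajectory x_{1:T}

def norm_logpdf(v, mean, std):
    return -0.5 * ((v - mean) / std) ** 2 - np.log(std * np.sqrt(2 * np.pi))

def log_gamma(t):
    # log p(x_{1:t}, y_{1:t}): x_1 ~ N(0,1), x_s ~ N(x_{s-1},1), y_s ~ N(x_s,1)
    lp = norm_logpdf(x[0], 0.0, 1.0) + norm_logpdf(y[0], x[0], 1.0)
    for s in range(1, t):
        lp += norm_logpdf(x[s], x[s - 1], 1.0) + norm_logpdf(y[s], x[s], 1.0)
    return lp

def log_q(t):
    # log q(x_{1:t}) for the prior proposal (transition terms only)
    lp = norm_logpdf(x[0], 0.0, 1.0)
    for s in range(1, t):
        lp += norm_logpdf(x[s], x[s - 1], 1.0)
    return lp

# full weight: log w_T = log gamma_T(x_{1:T}) - log q(x_{1:T})
log_w_full = log_gamma(T) - log_q(T)

# product of incremental weights v_t = gamma_t / (gamma_{t-1} q(x_t | x_{t-1}))
log_w_incr = log_gamma(1) - log_q(1)
for t in range(2, T + 1):
    log_v = log_gamma(t) - log_gamma(t - 1) - norm_logpdf(x[t - 1], x[t - 2], 1.0)
    log_w_incr += log_v

assert np.isclose(log_w_full, log_w_incr)
```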
slide-8
SLIDE 8 Importance Resampling

Idea: Perform "natural selection" by selecting particles with probability proportional to their weights.

$$w^k = \frac{\gamma(x^k)}{q(x^k)}, \qquad x^k \sim q(x), \qquad k = 1, \dots, K.$$

Each pair $(w^k, x^k)$ is a "particle" (a.k.a. a sample). Resample an index

$$a \sim \text{Discrete}(\bar w^1, \dots, \bar w^K), \qquad \bar w^k = \frac{w^k}{\sum_l w^l},$$

and set $\tilde x = x^a$ with new weight $\tilde w = \frac{1}{K} \sum_k w^k$. High-weight samples are selected more often.
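The resampling step above can be sketched as a short function. The numeric values are taken from the example on the next slide; the function name `resample` is my own.

```python
import numpy as np

rng = np.random.default_rng(3)

def resample(x, w):
    """Multinomial resampling: select particles with probability
    proportional to their weights; every survivor gets the average
    incoming weight."""
    K = len(w)
    w = np.asarray(w, dtype=float)
    a = rng.choice(K, size=K, p=w / w.sum())  # a ~ Discrete(w-bar)
    return x[a], np.full(K, w.mean())         # x-tilde = x^a, w-tilde = mean(w)

x = np.array([1.2, 0.5, 0.2, 1.2, 1.5])
w = np.array([1.0, 2.0, 3.0, 8.0, 12.0])
x_new, w_new = resample(x, w)
print(x_new, w_new)  # every new weight equals 26/5 = 5.2
```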
slide-9
SLIDE 9 Importance Resampling: Example

$x^k \sim q(x)$ for $k = 1, \dots, 5$, with

$$x = (1.2,\ 0.5,\ 0.2,\ 1.2,\ 1.5), \qquad w = (1.0,\ 2.0,\ 3.0,\ 8.0,\ 12.0), \qquad \bar w = \Big(\tfrac{1}{26},\ \tfrac{2}{26},\ \tfrac{3}{26},\ \tfrac{8}{26},\ \tfrac{12}{26}\Big).$$

Sampling ancestor indices in proportion to the weights gives, e.g., $a = (5, 4, 5, 5, 3)$, so the resampled particles are

$$\tilde x = (x^5, x^4, x^5, x^5, x^3) = (1.5,\ 1.2,\ 1.5,\ 1.5,\ 0.2),$$

each carrying the same weight $\tilde w = \frac{1}{K}\sum_k w^k = \frac{26}{5} = 5.2$.
slide-10
SLIDE 10 Importance Resampling: Interpretation

Auxiliary Variable Trick: Sample a single particle $\tilde x = x^a$ and treat the remaining $x^{k \ne a}$ as auxiliary variables.

Proposal:
$$q(x^{1:K}, a) = \Big(\prod_k q(x^k)\Big)\, q(a \mid x^{1:K}), \qquad q(a = h \mid x^{1:K}) = \frac{w^h}{\sum_l w^l}.$$

Target:
$$f(x^{1:K}, a) = \gamma(x^a)\, \Big(\prod_{k \ne a} q(x^k)\Big)\, \frac{1}{K}.$$

Weight:
$$w = \frac{f(x^{1:K}, a)}{q(x^{1:K}, a)} = \frac{\gamma(x^a)}{q(x^a)} \cdot \frac{1}{K} \cdot \frac{\sum_l w^l}{w^a} = \frac{1}{K} \sum_l w^l.$$
slide-11
SLIDE 11 Sequential Monte Carlo (General Formulation)

Assume: A sequence of targets $\gamma_1(x_1), \dots, \gamma_t(x_{1:t})$ and proposals $q_1(x_1), q_2(x_2 \mid x_1), \dots, q_t(x_t \mid x_{1:t-1})$.

Initialize ($t = 1$) with importance sampling:
$$x_1^i \sim q_1(x_1), \qquad w_1^i = \frac{\gamma_1(x_1^i)}{q_1(x_1^i)}.$$

For $t > 1$, resample and propagate:
$$a_t^i \sim \text{Disc}\Big(\frac{w_{t-1}^1}{\sum_l w_{t-1}^l}, \dots, \frac{w_{t-1}^S}{\sum_l w_{t-1}^l}\Big), \qquad x_t^i \sim q_t\big(x_t \mid x_{1:t-1}^{a_t^i}\big),$$
then reweight with
$$w_t^i = \frac{\gamma_t(x_{1:t}^i)}{\gamma_{t-1}\big(x_{1:t-1}^{a_t^i}\big)\, q_t\big(x_t^i \mid x_{1:t-1}^{a_t^i}\big)},$$
the weight for sequential importance sampling.
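The general formulation can be sketched as a generic sampler parameterized by $\gamma_t$ and $q_t$. The instantiation below (a hypothetical random-walk model with the transition prior as proposal) is an assumption for illustration; the function and variable names are my own.

```python
import numpy as np

rng = np.random.default_rng(4)

def norm_logpdf(v, mean, std):
    return -0.5 * ((v - mean) / std) ** 2 - np.log(std * np.sqrt(2 * np.pi))

T, S = 10, 300
y = rng.normal(size=T)

def log_gamma(t, X):
    # log gamma_t = log p(x_{1:t}, y_{1:t}) for the hypothetical model
    lp = norm_logpdf(X[:, 0], 0.0, 1.0) + norm_logpdf(y[0], X[:, 0], 1.0)
    for s in range(1, t):
        lp += (norm_logpdf(X[:, s], X[:, s - 1], 1.0)
               + norm_logpdf(y[s], X[:, s], 1.0))
    return lp

def sample_q(t, X_prev):
    # proposal q_t = transition prior
    if X_prev is None:
        return rng.normal(size=(S, 1))
    return X_prev[:, -1:] + rng.normal(size=(S, 1))

def log_q(t, x_new, X_prev):
    if X_prev is None:
        return norm_logpdf(x_new[:, 0], 0.0, 1.0)
    return norm_logpdf(x_new[:, 0], X_prev[:, -1], 1.0)

def smc(T, S):
    X = sample_q(1, None)                       # x_1^i ~ q_1
    logw = log_gamma(1, X) - log_q(1, X, None)  # w_1^i = gamma_1 / q_1
    for t in range(2, T + 1):
        w = np.exp(logw - logw.max())
        a = rng.choice(S, size=S, p=w / w.sum())  # resample ancestors
        X = X[a]
        x_new = sample_q(t, X)                    # x_t^i ~ q_t(. | x^{a_t^i})
        X_next = np.hstack([X, x_new])
        # SIS weight after resampling: gamma_t / (gamma_{t-1} q_t)
        logw = log_gamma(t, X_next) - log_gamma(t - 1, X) - log_q(t, x_new, X)
        X = X_next
    return X, logw

X, logw = smc(T, S)
print(X.shape)
```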
slide-12
SLIDE 12 Gibbs Sampling

Idea: Propose 1 variable at a time, holding other variables constant.

$$\gamma(x) = p(y \mid x_1, x_2)\, p(x_1, x_2), \qquad x_1' \sim p(x_1 \mid y, x_2) = \frac{p(y, x_1, x_2)}{p(y, x_2)}.$$

Acceptance Ratio: Can accept with probability 1, since

$$\alpha = \min\Big(1,\ \frac{p(y, x_1', x_2)\, p(x_1 \mid y, x_2)}{p(y, x_1, x_2)\, p(x_1' \mid y, x_2)}\Big) = \min\Big(1,\ \frac{p(y, x_1', x_2)\, p(y, x_1, x_2)}{p(y, x_1, x_2)\, p(y, x_1', x_2)}\Big) = 1.$$
slide-13
SLIDE 13 Gibbs Sampling: Gaussian Mixture (Homework 2)

Generative Model (see the graphical model on the slide):
$$\mu_k, \Sigma_k \sim p(\mu, \Sigma), \quad k = 1, \dots, K,$$
$$z_n \sim \text{Discrete}(\pi_1, \dots, \pi_K), \quad n = 1, \dots, N,$$
$$y_n \mid z_n = k \sim \text{Norm}(\mu_k, \Sigma_k).$$

Gibbs Sampler Updates:
Local variables: $z_n \sim p(z_n \mid y_n, \mu, \Sigma)$, $n = 1, \dots, N$.
Global variables: $\mu_k, \Sigma_k \sim p(\mu_k, \Sigma_k \mid y_{1:N}, z_{1:N})$, $k = 1, \dots, K$.
slide-14
SLIDE 14 Gibbs Sampling: Conditional Independence

Local Variables: $z_n \perp y_{1:N \setminus n}, z_{1:N \setminus n} \mid \mu, \Sigma$, since
$$p(y_{1:N}, z_{1:N} \mid \mu, \Sigma) = \prod_n p(y_n, z_n \mid \mu, \Sigma).$$
The local update is
$$p(z_n = k \mid y_n, \mu, \Sigma) = \frac{p(y_n, z_n = k \mid \mu, \Sigma)}{\sum_l p(y_n, z_n = l \mid \mu, \Sigma)},$$
so we can compute all updates in $O(NK)$.

Global Variables: $\mu_k, \Sigma_k \perp \mu_{1:K \setminus k}, \Sigma_{1:K \setminus k} \mid z_{1:N}$, since
$$p(y_{1:N}, \mu, \Sigma \mid z_{1:N}) = \prod_k p(\mu_k, \Sigma_k) \prod_{n\,:\, z_n = k} \text{Norm}(y_n;\ \mu_k, \Sigma_k).$$
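A sketch of both updates for a simplified version of the homework model: 1-D data, $K = 2$, known unit variance, and a conjugate Normal prior on each mean (the homework also samples covariances, which this sketch omits). The vectorized responsibility computation is the $O(NK)$ local update from this slide.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic data from a well-separated 2-component mixture (hypothetical)
K, N = 2, 200
true_means = np.array([-3.0, 3.0])
z_true = rng.integers(K, size=N)
y = true_means[z_true] + rng.normal(size=N)

pi = np.full(K, 1.0 / K)   # fixed mixture weights for simplicity
mu = rng.normal(size=K)    # initialize the global variables
prior_var, lik_var = 100.0, 1.0

for sweep in range(100):
    # Local update: p(z_n = k | y_n, mu) for all n, k at once -> O(NK)
    logp = np.log(pi) - 0.5 * (y[:, None] - mu[None, :]) ** 2 / lik_var
    p = np.exp(logp - logp.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    z = np.array([rng.choice(K, p=p[n]) for n in range(N)])

    # Global update: conjugate Normal posterior for each cluster mean
    for k in range(K):
        yk = y[z == k]
        post_prec = 1.0 / prior_var + len(yk) / lik_var
        post_mean = (yk.sum() / lik_var) / post_prec
        mu[k] = post_mean + rng.normal() / np.sqrt(post_prec)

print("sampled means:", np.sort(mu))  # should settle near (-3, 3)
```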
slide-15
SLIDE 15 Gibbs Sampling: Global Update

Idea: Ensure that the prior is conjugate to the likelihood (likelihood × conjugate prior → posterior). For cluster $k$:

$$p(\mu_k, \Sigma_k \mid y_{1:N}, z_{1:N}) = \frac{\Big[\prod_{n\,:\, z_n = k} p(y_n \mid z_n = k, \mu_k, \Sigma_k)\Big]\, p(\mu_k, \Sigma_k)}{\text{marginal likelihood}}.$$
slide-16
SLIDE 16 Exponential Families

An exponential family distribution has the form

$$p(x \mid \eta) = h(x)\, \exp\big[\eta^\top t(x) - a(\eta)\big],$$

where $h(x)$ is the base measure (counting or Lebesgue; only depends on $x$), $\eta$ are the natural parameters (only depend on the parameters), $t(x)$ are the sufficient statistics (only depend on $x$), and $a(\eta)$ is the log normalizer.
slide-17
SLIDE 17 Example: Univariate Gaussian

$$p(x \mid \eta) = h(x)\, \exp\big[\eta^\top t(x) - a(\eta)\big] = (2\pi\sigma^2)^{-1/2} \exp\Big[-\frac{(x - \mu)^2}{2\sigma^2}\Big] = (2\pi)^{-1/2} \exp\Big[-\frac{x^2 - 2x\mu + \mu^2}{2\sigma^2} - \log\sigma\Big],$$

with

$$t(x) = (x,\ x^2), \qquad \eta = \Big(\frac{\mu}{\sigma^2},\ -\frac{1}{2\sigma^2}\Big), \qquad a(\eta) = \frac{\mu^2}{2\sigma^2} + \log\sigma, \qquad h(x) = \frac{1}{\sqrt{2\pi}}.$$
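This parameterization is easy to verify numerically: the natural-parameter form on this slide should reproduce the usual Gaussian density for any $\mu$, $\sigma$, $x$ (the sample values below are arbitrary).

```python
import math

mu, sigma = 1.5, 0.7

# Natural parameters, log normalizer, and base measure from this slide
eta1, eta2 = mu / sigma**2, -1.0 / (2 * sigma**2)
a = mu**2 / (2 * sigma**2) + math.log(sigma)  # a(eta)
h = 1.0 / math.sqrt(2 * math.pi)              # h(x)

for x in (-1.0, 0.0, 2.3):
    p_expfam = h * math.exp(eta1 * x + eta2 * x**2 - a)
    p_gauss = (math.exp(-((x - mu) ** 2) / (2 * sigma**2))
               / (sigma * math.sqrt(2 * math.pi)))
    assert math.isclose(p_expfam, p_gauss)
print("exponential-family form matches the Gaussian pdf")
```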
slide-18
SLIDE 18 Properties of Exponential Families

$$p(x \mid \eta) = h(x)\, \exp\big[\eta^\top t(x) - a(\eta)\big]$$

Moments as derivatives of the log normalizer. From normalization,

$$\int dx\; p(x \mid \eta) = 1 \quad\Rightarrow\quad \exp\big(a(\eta)\big) = \int dx\; h(x)\, \exp\big(\eta^\top t(x)\big).$$

Differentiating,

$$\nabla_\eta\, a(\eta) = \frac{\int dx\; t(x)\, h(x)\, \exp\big(\eta^\top t(x)\big)}{\int dx\; h(x)\, \exp\big(\eta^\top t(x)\big)} = \int dx\; t(x)\, h(x)\, \exp\big(\eta^\top t(x) - a(\eta)\big) = \mathbb{E}_{p(x \mid \eta)}\big[t(x)\big].$$
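A finite-difference check of this identity for the univariate Gaussian of the previous slide, where $t(x) = (x, x^2)$, so the two partial derivatives of $a(\eta)$ should equal $\mathbb{E}[x] = \mu$ and $\mathbb{E}[x^2] = \mu^2 + \sigma^2$:

```python
import math

def a(eta1, eta2):
    # log normalizer in natural parameters: invert eta -> (mu, sigma^2)
    sigma2 = -1.0 / (2 * eta2)
    mu = eta1 * sigma2
    return mu**2 / (2 * sigma2) + 0.5 * math.log(sigma2)

mu, sigma = 0.8, 1.3
eta1, eta2 = mu / sigma**2, -1.0 / (2 * sigma**2)
eps = 1e-6

# central finite differences of a(eta)
d1 = (a(eta1 + eps, eta2) - a(eta1 - eps, eta2)) / (2 * eps)
d2 = (a(eta1, eta2 + eps) - a(eta1, eta2 - eps)) / (2 * eps)

assert math.isclose(d1, mu, rel_tol=1e-4)                # E[x]
assert math.isclose(d2, mu**2 + sigma**2, rel_tol=1e-4)  # E[x^2]
```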
slide-19
SLIDE 19 Properties of Exponential Families

$$p(x \mid \eta) = h(x)\, \exp\big[\eta^\top t(x) - a(\eta)\big]$$

• Moments are computable from derivatives of $a(\eta)$: $\frac{da}{d\eta} = \mathbb{E}_{p(x \mid \eta)}[t(x)]$.
• When the $t(x)$ are linearly independent, the exponential family is known as minimal.
• For any minimal family, $a(\eta)$ is convex and $\mu := \mathbb{E}_{p(x \mid \eta)}[t(x)] \leftrightarrow \eta$ (there is a 1-to-1 mapping from $\eta$ to $\mathbb{E}_{p(x \mid \eta)}[t(x)]$).
slide-20
SLIDE 20 Conjugate priors

Likelihood:
$$p(x \mid \eta) = h(x)\, \exp\big[\eta^\top t(x) - a(\eta)\big]$$

Conjugate prior: with hyperparameters $\lambda = (\lambda_1, \lambda_2)$,
$$p(\eta \mid \lambda) = h(\eta)\, \exp\big[\lambda^\top t(\eta) - a(\lambda)\big], \qquad t(\eta) := \big(\eta,\ -a(\eta)\big).$$

Joint:
$$p(x, \eta) = h(x)\, h(\eta)\, \exp\big[\eta^\top t(x) - a(\eta) + \lambda_1^\top \eta - \lambda_2\, a(\eta) - a(\lambda)\big] = h(x)\, h(\eta)\, \exp\big[(\lambda_1 + t(x))^\top \eta - (\lambda_2 + 1)\, a(\eta) - a(\lambda)\big].$$
slide-21
SLIDE 21 Conjugate priors

Joint:
$$p(x, \eta) = h(x)\, h(\eta)\, \exp\big[\tilde\lambda^\top t(\eta) - a(\lambda)\big], \qquad \tilde\lambda_1 = \lambda_1 + t(x), \quad \tilde\lambda_2 = \lambda_2 + 1.$$

Marginal:
$$p(x) = \int d\eta\; p(x, \eta) = h(x)\, \exp\big[a(\tilde\lambda) - a(\lambda)\big],$$
so we can compute the marginal from the log normalizer.

Posterior:
$$p(\eta \mid x) = \frac{p(x, \eta)}{p(x)} = p\big(\eta \mid \tilde\lambda = \lambda + (t(x), 1)\big).$$

Conjugacy: the posterior lies in the same family as the prior.
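These identities can be checked concretely for the Beta-Bernoulli pair, a standard example not worked on the slide. With $p(x \mid \eta) = \exp(\eta x - a(\eta))$, $a(\eta) = \log(1 + e^\eta)$, a $\text{Beta}(\alpha, \beta)$ prior on the mean parameter corresponds to $\lambda = (\lambda_1, \lambda_2) = (\alpha, \alpha + \beta)$ with prior log normalizer $a(\lambda) = \log B(\lambda_1, \lambda_2 - \lambda_1)$:

```python
import math

def log_B(p, q):
    # log Beta function via log-gamma
    return math.lgamma(p) + math.lgamma(q) - math.lgamma(p + q)

def prior_log_norm(lam1, lam2):
    return log_B(lam1, lam2 - lam1)

alpha, beta = 2.0, 3.0
lam = (alpha, alpha + beta)

# Posterior after observing x: lambda-tilde = lambda + (t(x), 1)
x = 1
lam_post = (lam[0] + x, lam[1] + 1)

# Marginal from the log-normalizer difference: p(x) = exp[a(lam~) - a(lam)]
p_marginal = math.exp(prior_log_norm(*lam_post) - prior_log_norm(*lam))

# For x = 1 this should equal the Beta mean alpha / (alpha + beta)
assert math.isclose(p_marginal, alpha / (alpha + beta))
```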
slide-22
SLIDE 22 Gibbs Sampling: Homework

Idea: Ensure that the prior is conjugate to the likelihood (likelihood × conjugate prior → posterior). For cluster $k$:

$$p(\mu_k, \Sigma_k \mid y_{1:N}, z_{1:N}) = \frac{\Big[\prod_{n\,:\, z_n = k} p(y_n \mid z_n = k, \mu_k, \Sigma_k)\Big]\, p(\mu_k, \Sigma_k)}{\text{marginal likelihood}} = p\big(\mu_k, \Sigma_k \mid \lambda_k + t_k(y, z)\big).$$

Derive this in the homework.