Stochastic particle methods in Bayesian statistical learning
P. Del Moral (INRIA team ALEA), INRIA & Bordeaux Mathematical Institute & X CMAP
INRIA Workshop on Statistical Learning, IHP Paris, Dec. 5-6, 2011


  1. Stochastic particle methods in Bayesian statistical learning. P. Del Moral (INRIA team ALEA), INRIA & Bordeaux Mathematical Institute & X CMAP. INRIA Workshop on Statistical Learning, IHP Paris, Dec. 5-6, 2011. Some hyper-refs:
◮ Feynman-Kac Formulae: Genealogical and Interacting Particle Systems with Applications, Springer (2004)
◮ Sequential Monte Carlo Samplers, JRSS B (2006) (joint work with Doucet & Jasra)
◮ On the concentration of interacting processes, HAL-INRIA (2011) (joint work with Hu & Wu) [+ Refs]
◮ More references on the website http://www.math.u-bordeaux1.fr/~delmoral/index.html [+ Links]

  2. Stochastic particle sampling methods: interacting jump models, genetic type interacting particle models, particle Feynman-Kac models, the 4 particle estimates, island particle models ( ⊂ parallel computing). Bayesian statistical learning: nonlinear filtering models, fixed parameter estimation in HMM models, particle stochastic gradient models, Approximate Bayesian Computation, interacting Kalman filters, uncertainty propagation in numerical codes. Concentration inequalities: current population models, particle free energy, genealogical tree models, backward particle models.

  3. Stochastic particle sampling methods: interacting jump models, genetic type interacting particle models, particle Feynman-Kac models, the 4 particle estimates, island particle models ( ⊂ parallel computing). Bayesian statistical learning. Concentration inequalities.

  5. Introduction. Stochastic particle methods = universal adaptive sampling technique. Two types of stochastic interacting particle models:
◮ Diffusive particle models with mean field drifts [McKean-Vlasov style] ←→ fluid mechanics
◮ Interacting jump particle models [Boltzmann & Feynman-Kac style] ←→ fluid mechanics & physics + biology + engineering + statistics

  6. Lectures ⊂ interacting jump models
◮ Interacting jumps = recycling transitions
◮ Discrete time models ( ⇔ geometric rejection/jump times)

  7. Genetic type interacting particle models
◮ Mutation-proposals w.r.t. Markov transitions X_{n-1} → X_n ∈ E_n.
◮ Selection-rejection-recycling w.r.t. a potential/fitness function G_n.
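The mutation-selection mechanism above can be sketched in a few lines. Everything concrete here (Gaussian random-walk proposals, the Gaussian fitness function, N = 100 particles) is an illustrative assumption, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

def mutate(particles):
    # Mutation-proposal: one step of an (assumed) Gaussian random-walk kernel
    return particles + rng.normal(0.0, 0.5, size=particles.shape)

def select(particles, G):
    # Selection-recycling: resample particles in proportion to their fitness G_n
    w = G(particles)
    w = w / w.sum()
    idx = rng.choice(len(particles), size=len(particles), p=w)
    return particles[idx]

# Illustrative potential/fitness function (an assumption, not from the slides)
G = lambda x: np.exp(-0.5 * x**2)

particles = rng.normal(size=100)      # initial population, N = 100
for n in range(10):                   # ten mutation-selection rounds
    particles = select(mutate(particles), G)
```

Iterating selection after mutation drives the population toward regions where the fitness G_n is large, which is the recycling interpretation of the interacting jumps.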

  9. Equivalent particle algorithms:

    Sequential Monte Carlo   | Sampling             | Resampling
    Particle Filters         | Prediction           | Updating
    Genetic Algorithms       | Mutation             | Selection
    Evolutionary Population  | Exploration          | Branching-selection
    Diffusion Monte Carlo    | Free evolutions      | Absorption
    Quantum Monte Carlo      | Walkers motions      | Reconfiguration
    Sampling Algorithms      | Transition proposals | Accept-reject-recycle

More botanical names: bootstrapping, spawning, cloning, pruning, replenishing, multi-level splitting, enrichment, go with the winner, ...
1950 ≤ meta-heuristic style stochastic algorithms ≤ 1996

  10. A single stochastic model: particle interpretation of Feynman-Kac path integrals.

  14. Genealogical tree evolution, (size, time) = (N, n) = (3, 3).   [genealogical tree diagram]
Meta-heuristics ⇒ "96' Meta-Theorem": ancestral lines ≃ i.i.d. path samples w.r.t. the Feynman-Kac measure

    Q_n := (1/Z_n) { ∏_{0 ≤ p < n} G_p(X_p) } P_n,   with   P_n := Law(X_0, ..., X_n)

and inversely!
Example: Q_n = Law((X_0, ..., X_n) | X_p ∈ A_p, p < n)  ⟺  G_n = 1_{A_n}
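For the indicator example this equivalence can be checked numerically on one set of simulated paths: with G_p = 1_{A_p}, the Feynman-Kac weighted average of any test function equals the plain average over the paths that survive the constraint. The random-walk dynamics, the set A_p = (-2, 2), and the test function below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, M = 5, 10_000
A = (-2.0, 2.0)                      # assumed A_p = (-2, 2) for every p

# Simulate M independent Gaussian random-walk paths X_0, ..., X_n
paths = np.cumsum(rng.normal(size=(M, n + 1)), axis=1)

# Feynman-Kac weights: product of the indicators G_p = 1_A over p < n
inside = (paths[:, :n] > A[0]) & (paths[:, :n] < A[1])
w = inside.all(axis=1).astype(float)

f = paths[:, n] ** 2                 # test function f(X_n) = X_n^2

# Weighted Feynman-Kac average (with its 1/Z_n normalization) ...
fk = (w * f).sum() / w.sum()
# ... coincides with the conditional average over surviving paths
cond = f[w == 1.0].mean()
```

With 0/1 weights the two estimators are algebraically identical, which is exactly the stated equivalence between the weighted measure Q_n and the conditional law.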

  18. Particle estimates. More formally,

    (ξ^i_{0,n}, ξ^i_{1,n}, ..., ξ^i_{n,n}) := ancestral line of the i-th current individual ξ^i_n
    ⇓
    (1/N) ∑_{1 ≤ i ≤ N} δ_{(ξ^i_{0,n}, ξ^i_{1,n}, ..., ξ^i_{n,n})}  →_{N→∞}  Q_n

⊕ Current population models:

    η^N_n := (1/N) ∑_{1 ≤ i ≤ N} δ_{ξ^i_n}  →_{N→∞}  η_n = n-th time marginal of Q_n

⊕ Unbiased particle approximation:

    Z^N_n = ∏_{0 ≤ p < n} η^N_p(G_p)  →_{N→∞}  Z_n = E[ ∏_{0 ≤ p < n} G_p(X_p) ] = ∏_{0 ≤ p < n} η_p(G_p)

Ex.: G_n = 1_{A_n}  ⇒  Z^N_n = proportion of success  →  P(X_p ∈ A_p, p < n)
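The "proportion of success" reading of Z^N_n can be tested against crude Monte Carlo: run a bootstrap-style particle scheme with indicator potentials and multiply the per-step success proportions. The random-walk dynamics, the set A = (-2, 2), and all sizes are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)
n, N = 10, 10_000                    # time horizon and particle number
A = (-2.0, 2.0)                      # assumed A_p = (-2, 2) for every p

def in_A(x):
    return (x > A[0]) & (x < A[1])

# Particle estimate Z_n^N = prod_{p < n} eta_p^N(G_p) with G_p = 1_A
x = rng.normal(size=N)               # X_0 ~ N(0, 1)
Z = 1.0
for p in range(n):
    g = in_A(x)
    Z *= g.mean()                    # eta_p^N(G_p) = proportion of success
    survivors = x[g]                 # selection: recycle the survivors
    x = survivors[rng.integers(0, len(survivors), size=N)]
    x = x + rng.normal(size=N)       # mutation: next random-walk step

# Crude Monte Carlo estimate of P(X_p in A_p, p < n) for comparison
M = 100_000
paths = np.cumsum(rng.normal(size=(M, n)), axis=1)   # X_0, ..., X_{n-1}
crude = in_A(paths).all(axis=1).mean()
```

Both estimators are unbiased for P(X_p ∈ A_p, p < n); for rare events the interacting product estimate typically has a far smaller relative variance than the crude success count.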

  19. Graphical illustration: η^N_n := (1/N) ∑_{1 ≤ i ≤ N} δ_{ξ^i_n} ≃ η_n   [slides 19-38: animated illustration of the evolving particle population]

  40. Complete ancestral tree: when G_{n-1}(x) M_n(x, dy) = H_n(x, y) λ(dy), we have the backward Markov chain model

    Q^N_n(d(x_0, ..., x_n)) := η^N_n(dx_n) M_{n, η^N_{n-1}}(x_n, dx_{n-1}) ... M_{1, η^N_0}(x_1, dx_0)

with the random particle matrices

    M_{n+1, η^N_n}(x_{n+1}, dx_n) ∝ η^N_n(dx_n) H_{n+1}(x_n, x_{n+1})

Example: normalized additive functionals

    f̄_n(x_0, ..., x_n) = (1/(n+1)) ∑_{0 ≤ p ≤ n} f_p(x_p)
    ⇓
    Q^N_n(f̄_n) := (1/(n+1)) ∑_{0 ≤ p ≤ n} η^N_n M_{n, η^N_{n-1}} ... M_{p+1, η^N_p}(f_p)   [matrix operations]
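The backward model lends itself to a direct implementation: run a forward particle system, then propagate the additive functional through the random particle matrices. In the sketch below the Gaussian random-walk mutation, the potential G_p(x) = exp(-x²/2), the test function f_p(x) = x², and all sizes are illustrative assumptions; the result is checked against a brute-force importance-sampling estimate of Q_n(f̄_n) over i.i.d. paths:

```python
import numpy as np

rng = np.random.default_rng(3)
n, N = 5, 2000                        # time horizon, particle number
G = lambda x: np.exp(-0.5 * x**2)     # assumed potential (illustrative)
f = lambda x: x**2                    # additive test function f_p = f

def m_density(x, y):
    # Unnormalized density of the Gaussian random-walk mutation y = x + N(0, 1)
    return np.exp(-0.5 * (y - x)**2)

# Forward pass: bootstrap particle system, storing each population
xs = [rng.normal(size=N)]
for p in range(n):
    w = G(xs[p])
    w = w / w.sum()
    idx = rng.choice(N, size=N, p=w)              # selection
    xs.append(xs[p][idx] + rng.normal(size=N))    # mutation

# Backward pass: T_{p+1}(i) = f(x_{p+1}^i) + sum_j M_{p+1}(i, j) T_p(j)
T = f(xs[0])
for p in range(n):
    # random particle matrix M_{p+1}(i, j) ∝ G(x_p^j) H_{p+1}(x_p^j, x_{p+1}^i)
    B = G(xs[p])[None, :] * m_density(xs[p][None, :], xs[p + 1][:, None])
    B = B / B.sum(axis=1, keepdims=True)          # row-stochastic normalization
    T = f(xs[p + 1]) + B @ T
bwd = T.mean() / (n + 1)              # backward estimate of Q_n(f_bar_n)

# Reference: importance sampling over i.i.d. random-walk paths
M_paths = 200_000
paths = np.cumsum(rng.normal(size=(M_paths, n + 1)), axis=1)
W = np.prod(G(paths[:, :n]), axis=1)  # Feynman-Kac weights, p < n
ref = (W * f(paths).mean(axis=1)).sum() / W.sum()
```

The backward recursion accumulates T_n(i) = ∑_p [M_n ... M_{p+1} f_p](i), so averaging T_n over the final population and dividing by n + 1 reproduces the matrix-operation formula on the slide.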
