
Sampling Methods II, Henrik I. Christensen, Robotics & Intelligent Machines @ GT (PowerPoint PPT Presentation)



1. Sampling Methods – II
- Henrik I. Christensen
- Robotics & Intelligent Machines @ GT, Georgia Institute of Technology, Atlanta, GA 30332-0280
- hic@cc.gatech.edu

2. Outline
- Introduction
- Markov Chain Monte Carlo
- Gibbs Sampling
- Slice Sampling
- Hybrid Monte-Carlo
- Summary

3. Introduction
- Last time we talked about sampling methods: generating distribution estimates based on samples of the input space.
- We discussed rejection sampling and importance sampling.
- Their typical problems are high rejection rates and poor generalization to higher-dimensional spaces.
- Today: methods that generalize to higher-dimensional spaces.


5. Markov Chain Monte Carlo
- We sample from a proposal distribution, maintaining a record of the current sample $z^{(\tau)}$; the proposal distribution $q(z \mid z^{(\tau)})$ depends on this current state.
- Assume $p(z) = \tilde{p}(z)/Z_p$ and that we can evaluate $\tilde{p}(z)$, even if the normalizer $Z_p$ is unknown.
- Generate a candidate sample $z^\ast$ and accept it if an acceptance criterion is satisfied.
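To make this setup concrete, here is a minimal sketch; the bimodal unnormalized target and the Gaussian random-walk proposal are illustrative choices, not taken from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)

# Unnormalized target p_tilde(z): a two-component Gaussian mixture, so the
# normalizing constant Z_p is never needed below (illustrative choice).
def p_tilde(z):
    return np.exp(-0.5 * (z - 2.0) ** 2) + 0.5 * np.exp(-0.5 * (z + 2.0) ** 2)

# Random-walk proposal q(z* | z_tau): a Gaussian centred on the current
# sample, so the proposal is symmetric, q(zA | zB) = q(zB | zA).
def propose(z_tau, step=0.5):
    return z_tau + step * rng.standard_normal()
```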

6. Metropolis Algorithm
- Assume a symmetric proposal: $q(z_A \mid z_B) = q(z_B \mid z_A)$.
- The acceptance criterion is then
  $A(z^\ast, z^{(\tau)}) = \min\left(1, \frac{\tilde{p}(z^\ast)}{\tilde{p}(z^{(\tau)})}\right)$
- Generate a random number $u \in (0, 1)$ and update
  $z^{(\tau+1)} = \begin{cases} z^\ast & \text{if } A(z^\ast, z^{(\tau)}) > u \\ z^{(\tau)} & \text{otherwise} \end{cases}$
- I.e., if the candidate is accepted, use it; otherwise keep the previous sample.
- The basic Metropolis scheme is essentially a random walk and as such not very efficient.
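A minimal, self-contained sketch of this accept/reject loop; the unnormalized bimodal target, step size, and sample count are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)

def p_tilde(z):
    # Unnormalized target density (illustrative bimodal choice).
    return np.exp(-0.5 * (z - 2.0) ** 2) + 0.5 * np.exp(-0.5 * (z + 2.0) ** 2)

def metropolis(n_samples=5000, step=0.5, z0=0.0):
    z = z0
    samples = []
    for _ in range(n_samples):
        z_star = z + step * rng.standard_normal()      # symmetric proposal
        A = min(1.0, p_tilde(z_star) / p_tilde(z))     # acceptance probability
        if A > rng.uniform():                          # accept the candidate ...
            z = z_star
        samples.append(z)                              # ... else keep z^(tau)
    return np.array(samples)

samples = metropolis()
print(samples.mean(), samples.std())
```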

7. Markov Chains
- Assume we have a series of random variables $z^{(1)}, z^{(2)}, \ldots, z^{(M)}$.
- A first-order Markov chain is defined by the conditional independence
  $p(z^{(m+1)} \mid z^{(1)}, z^{(2)}, \ldots, z^{(m)}) = p(z^{(m+1)} \mid z^{(m)})$
- The marginal probability is then given by the transition probabilities and the initial prior:
  $p(z^{(m+1)}) = \sum_{z^{(m)}} p(z^{(m+1)} \mid z^{(m)})\, p(z^{(m)})$
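As a concrete illustration of propagating the marginal through the transition probabilities, here is a small sketch with a made-up three-state chain (all transition values are invented for the example):

```python
import numpy as np

# Transition matrix T[i, j] = p(z^(m+1) = i | z^(m) = j) for a 3-state chain
# (illustrative values; each column sums to one).
T = np.array([[0.9, 0.1, 0.0],
              [0.1, 0.8, 0.2],
              [0.0, 0.1, 0.8]])

p = np.array([1.0, 0.0, 0.0])   # initial prior p(z^(1))
for _ in range(50):
    p = T @ p                   # p(z^(m+1)) = sum_{z^(m)} p(z^(m+1)|z^(m)) p(z^(m))
print(p)                        # approaches the chain's stationary distribution
```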

8. Markov Chain Properties
- A Markov chain is called homogeneous when the transition probabilities are the same for all steps.
- A distribution is invariant (stationary) if it remains unchanged under the transitions, i.e.
  $p^\ast(z) = \sum_{z'} p(z \mid z')\, p^\ast(z')$
- A sufficient condition for invariance is that the transition probabilities satisfy detailed balance:
  $p^\ast(z)\, p(z' \mid z) = p^\ast(z')\, p(z \mid z')$
- We require that the desired distribution is invariant and that the chain converges to it as $m \to \infty$.
- This property is called ergodicity, and the final distribution is termed the equilibrium distribution.
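A small numerical check of both properties, using a made-up two-state chain whose transition probabilities were chosen to satisfy detailed balance with respect to a chosen $p^\ast$:

```python
import numpy as np

# Desired stationary distribution p*(z) over two states (illustrative values).
p_star = np.array([0.25, 0.75])

# Transition probabilities P[z_next, z_current] = p(z_next | z_current),
# constructed so that p*(z) p(z'|z) = p*(z') p(z|z').
P = np.array([[0.7, 0.1],
              [0.3, 0.9]])

# Detailed balance: p*(0) p(1|0) == p*(1) p(0|1)
print(p_star[0] * P[1, 0], p_star[1] * P[0, 1])   # both 0.075

# Invariance: p*(z) = sum_{z'} p(z|z') p*(z')
print(P @ p_star)                                  # equals p_star
```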


10. Gibbs Sampling
- Gibbs sampling is a widely applicable MCMC algorithm.
- Consider a distribution $p(z) = p(z_1, z_2, \ldots, z_M)$.
- In each step, one of the variables is resampled from its conditional distribution given the current values of the remaining variables.
- Example: consider $p(z_1, z_2, z_3)$. One cycle samples
  $z_1^{(\tau+1)} \sim p(z_1 \mid z_2^{(\tau)}, z_3^{(\tau)})$
  $z_2^{(\tau+1)} \sim p(z_2 \mid z_1^{(\tau+1)}, z_3^{(\tau)})$
  $z_3^{(\tau+1)} \sim p(z_3 \mid z_1^{(\tau+1)}, z_2^{(\tau+1)})$
- Continue until convergence.
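A minimal Gibbs sketch for a two-variable case, assuming a correlated bivariate Gaussian target (an illustrative choice) whose conditionals are Gaussian and available in closed form:

```python
import numpy as np

rng = np.random.default_rng(2)
rho = 0.95                      # correlation of the bivariate Gaussian target

def gibbs(n_samples=5000):
    z1, z2 = 0.0, 0.0
    samples = []
    for _ in range(n_samples):
        # For this target, p(z1 | z2) = N(rho*z2, 1 - rho^2) and vice versa.
        z1 = rng.normal(rho * z2, np.sqrt(1.0 - rho ** 2))
        z2 = rng.normal(rho * z1, np.sqrt(1.0 - rho ** 2))
        samples.append((z1, z2))
    return np.array(samples)

print(np.corrcoef(gibbs().T)[0, 1])   # close to rho
```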

11. Gibbs Example
- [Figure: Gibbs sampling on a correlated two-dimensional distribution with axes $z_1$ and $z_2$; the labels $l$ and $L$ mark the width of the conditional distributions and the overall extent of the joint distribution.]

12. Gibbs Sampling in Graphical Models
- Initialize the variables, starting from the parents, and then traverse the tree/graph, resampling each variable in turn conditioned on its neighbours (its Markov blanket).
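A rough sketch of this idea on a tiny chain-structured model A -> B -> C with binary variables; the conditional probability tables, the evidence, and the names pA, pB_A, pC_B are all invented for the example:

```python
import numpy as np

rng = np.random.default_rng(3)

# Tiny chain A -> B -> C with binary variables (illustrative numbers).
pA = np.array([0.6, 0.4])                 # p(A)
pB_A = np.array([[0.7, 0.2],              # p(B | A); rows = B, cols = A
                 [0.3, 0.8]])
pC_B = np.array([[0.9, 0.3],              # p(C | B); rows = C, cols = B
                 [0.1, 0.7]])

C = 1                                     # observed evidence
A, B = 0, 0                               # initialization, parents first
counts = np.zeros(2)
for _ in range(20000):
    # Resample A from p(A | B) proportional to p(A) p(B | A).
    w = pA * pB_A[B, :]
    A = rng.choice(2, p=w / w.sum())
    # Resample B from p(B | A, C) proportional to p(B | A) p(C | B).
    w = pB_A[:, A] * pC_B[C, :]
    B = rng.choice(2, p=w / w.sum())
    counts[A] += 1
print(counts / counts.sum())              # Monte Carlo estimate of p(A | C = 1)
```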


14. Slice Sampling
- The Metropolis algorithm is sensitive to the sampling step size.
- Slice sampling adapts the step size automatically to the local shape of the distribution.
- [Figure: (a) a height $u$ is drawn uniformly under $\tilde{p}(z)$ at the current sample $z^{(\tau)}$, defining a "slice"; (b) a bracketing interval $(z_{\min}, z_{\max})$ around the slice from which the next sample $z$ is drawn.]
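A minimal slice-sampling sketch using the standard stepping-out and shrinkage procedure; the Gaussian-shaped unnormalized target and the initial width w are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(4)

def p_tilde(z):
    # Unnormalized target density (illustrative Gaussian shape).
    return np.exp(-0.5 * z ** 2)

def slice_sample(z, w=1.0, n_samples=5000):
    samples = []
    for _ in range(n_samples):
        u = rng.uniform(0.0, p_tilde(z))        # height defining the slice
        # Step out: grow an interval around z until both ends leave the slice.
        z_min = z - w * rng.uniform()
        z_max = z_min + w
        while p_tilde(z_min) > u:
            z_min -= w
        while p_tilde(z_max) > u:
            z_max += w
        # Sample within the interval, shrinking it after each rejection.
        while True:
            z_new = rng.uniform(z_min, z_max)
            if p_tilde(z_new) > u:
                z = z_new
                break
            if z_new < z:
                z_min = z_new
            else:
                z_max = z_new
        samples.append(z)
    return np.array(samples)

s = slice_sample(0.0)
print(s.mean(), s.std())
```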


16. Hybrid Monte-Carlo
- The Metropolis algorithm has step-size issues.
- Hybrid Monte-Carlo is a method with an adaptive step size and low rejection rates.
- It adopts a dynamical-systems view of the sampling problem.

17. Dynamical Systems
- In physics the Hamiltonian expresses the total energy of a system.
- If we consider a particle in motion, the momentum is $r = \frac{dz}{d\tau}$.
- The joint space of state and momentum $(z, r)$ is called the phase space.
- We can rewrite the probability as $p(z) = \frac{1}{Z_p} \exp(-E(z))$, where $E(z)$ acts as a potential energy.
- The rate of change of the momentum (the force) is $\frac{dr}{d\tau} = -\frac{\partial E(z)}{\partial z}$.
- The kinetic energy is $K(r) = \frac{1}{2}\lVert r \rVert^2$.
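As a simple worked case (an illustration, not from the slides): for a standard Gaussian target $p(z) \propto \exp(-z^2/2)$ the potential energy is $E(z) = z^2/2$, so the force is $\frac{dr}{d\tau} = -z$ and the simulated dynamics trace out harmonic motion in the $(z, r)$ phase space.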

18. Hamiltonian model
- The Hamiltonian is then $H(z, r) = E(z) + K(r)$.
- The coupled equations of motion are
  $\frac{dz_i}{d\tau} = \frac{\partial H}{\partial r_i}$
  $\frac{dr_i}{d\tau} = -\frac{\partial H}{\partial z_i}$

19. Hamiltonian model
- The Hamiltonian (total energy) is constant along a trajectory, but energy can be traded between $E(z)$ and $K(r)$, i.e. between $z$ and $r$.
- We can control the motion of the dynamical system; for example, $r$ can periodically be replaced by a fresh sample drawn from its own distribution.
- This is in spirit parallel to Newton-Raphson optimization, where gradient information is used to control the step size.

20. Leapfrog Discretization
- Discretization that alternates half-step updates of the momentum with full-step updates of the position:
  $r_i(\tau + \epsilon/2) = r_i(\tau) - \frac{\epsilon}{2} \frac{\partial E}{\partial z_i}(z(\tau))$
  $z_i(\tau + \epsilon) = z_i(\tau) + \epsilon\, r_i(\tau + \epsilon/2)$
  $r_i(\tau + \epsilon) = r_i(\tau + \epsilon/2) - \frac{\epsilon}{2} \frac{\partial E}{\partial z_i}(z(\tau + \epsilon))$
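A minimal sketch of a single leapfrog step as written above; the helper grad_E and its quadratic potential (so grad_E(z) = z) are illustrative assumptions:

```python
def grad_E(z):
    # Gradient of the potential energy; here E(z) = z**2 / 2 (illustration only).
    return z

def leapfrog_step(z, r, eps, grad_E):
    r_half = r - 0.5 * eps * grad_E(z)            # half-step for the momentum
    z_new = z + eps * r_half                      # full step for the position
    r_new = r_half - 0.5 * eps * grad_E(z_new)    # remaining momentum half-step
    return z_new, r_new

z, r = 1.0, 0.0
for _ in range(100):
    z, r = leapfrog_step(z, r, 0.1, grad_E)
print(z, r)   # stays close to the constant-energy orbit z**2 + r**2 = 1
```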

21. Hybrid Monte-Carlo
- Consider a state $(z, r)$ and an updated state $(z^\ast, r^\ast)$ obtained by leapfrog integration.
- The candidate is accepted with probability
  $\min\left(1, \exp\{H(z, r) - H(z^\ast, r^\ast)\}\right)$
- Since the Hamiltonian is (approximately) conserved by the dynamics, the strategy is to make a random change to the momentum before the leapfrog integration and then apply the acceptance test to the resulting update.
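Putting the pieces together, a minimal hybrid (Hamiltonian) Monte Carlo sketch; the standard Gaussian target, step size, and trajectory length are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def E(z):          # potential energy of a standard Gaussian target (illustration)
    return 0.5 * z ** 2

def grad_E(z):
    return z

def hmc(n_samples=2000, eps=0.2, n_leapfrog=20, z0=0.0):
    z = z0
    samples = []
    for _ in range(n_samples):
        r = rng.standard_normal()                 # fresh random momentum
        z_new, r_new = z, r
        # Leapfrog integration of Hamilton's equations.
        r_new -= 0.5 * eps * grad_E(z_new)
        for _ in range(n_leapfrog):
            z_new += eps * r_new
            r_new -= eps * grad_E(z_new)
        r_new += 0.5 * eps * grad_E(z_new)        # turn the last full momentum step into a half-step
        # Accept with probability min(1, exp(H(z, r) - H(z*, r*))).
        H_old = E(z) + 0.5 * r ** 2
        H_new = E(z_new) + 0.5 * r_new ** 2
        if rng.uniform() < min(1.0, np.exp(H_old - H_new)):
            z = z_new
        samples.append(z)
    return np.array(samples)

s = hmc()
print(s.mean(), s.std())   # roughly 0 and 1 for the standard Gaussian target
```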


23. Summary
- MCMC is about tracking a state while sampling: the current estimate is used to generate the next sample in an iterative update.
- We considered several update strategies:
- Metropolis: a basic random walk.
- Slice sampling: a way to adapt step sizes.
- Gibbs sampling: stepwise, variable-by-variable updating.
- Hybrid MCMC: a way to integrate gradient information.
