Diverse Particle Selection for High-Dimensional Inference in Graphical Models




1. Diverse Particle Selection for High-Dimensional Inference in Graphical Models
Erik Sudderth, UC Irvine Computer Science
Collaborators:
• Particle Max-Product: Jason Pacheco, MIT
• Human Pose: Silvia Zuffi & Michael Black, MPI Tübingen
Related papers at ICML 2014 & ICML 2015

2. High-Dimensional Inference
[Slide diagram: Data, Probability Model, Unknowns, Estimate]
• Discrete unknowns: efficient inference based on combinatorial optimization.
• Continuous unknowns: unless we make unrealistic model approximations, there are no efficient general solutions, and standard gradient-based optimization is ineffective.

3. Continuous Inference Problems
• Human pose estimation & tracking
• Protein structure & side-chain prediction
• Robot motion & vehicle path planning

4. Maximum a Posteriori (MAP)
[Slide diagram: Data, Unknowns, Posterior, MAP Estimate]
The posterior is often intractable and multimodal, complicating exact MAP inference.

5. Maximum a Posteriori (MAP)
[Slide diagram: Data, Unknowns, Posterior, Local Optimum]
The posterior is often intractable and multimodal, complicating exact MAP inference. Local optima can be useful when models are inaccurate or data are noisy.
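For reference, the MAP estimate discussed on these slides can be written in the standard form below (conventional notation, not quoted from the slides):

```latex
% MAP estimate: the configuration of the unknowns x that maximizes the
% posterior given the observed data y.
\hat{x}_{\mathrm{MAP}}
  = \arg\max_{x} \; p(x \mid y)
  = \arg\max_{x} \; p(y \mid x)\, p(x)
```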

6. Goal
Develop maximum a posteriori (MAP) inference algorithms for continuous probability models that:
• apply to any pairwise graphical model, even if the model is complex (highly non-Gaussian);
• are black-box (no gradients required);
• reliably infer multiple local optima.

7. Pairwise Graphical Models
• Nodes are continuous random variables, x_s ∈ R^d.
• Potentials encode statistical relationships.
• Edges indicate direct, pairwise energetic interactions.
[Slide figure: example graph with nodes x_1 through x_9]
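A pairwise graphical model of this kind factorizes over the nodes and edges of the graph; in standard notation (not quoted from the slides):

```latex
% Pairwise factorization over nodes s in V and edges (s,t) in E of G = (V, E).
p(x) \;\propto\; \prod_{s \in V} \psi_s(x_s) \prod_{(s,t) \in E} \psi_{st}(x_s, x_t),
\qquad x_s \in \mathbb{R}^d
```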

8. Message Passing on Trees
Global MAP inference decomposes into local computations via graph structure…
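On a tree, these local computations are the max-product message updates; the standard recursion is shown below (an assumption about the intended update, consistent with the max-marginal expression on the next slide):

```latex
% Message sent from node t to its neighbor s; Gamma(t) denotes the neighbors of t.
m_{ts}(x_s) = \max_{x_t} \; \psi_{st}(x_s, x_t)\, \psi_t(x_t)
              \prod_{u \in \Gamma(t) \setminus s} m_{ut}(x_t)
```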

9. Max-Product Belief Propagation
Finding max-marginals via message passing:

```latex
q_s(x_s) = \max_{x_{t \neq s}} p(x_s, x_{t \neq s})
         \;\propto\; \psi_s(x_s) \prod_{t \in \Gamma(s)} m_{ts}(x_s)
```

Why max-marginals?
• They directly encode the global MAP.
• Other modes are also important: models are approximate, data are uncertain.
Max-product dynamic programming finds exact max-marginals on tree-structured graphs.
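To make the max-marginal computation concrete, here is a minimal sketch (not from the slides) of max-product belief propagation on a chain whose nodes are restricted to finite candidate sets, as a particle-based method would do; the function and variable names are hypothetical illustrations:

```python
# Minimal sketch: max-product BP on a chain graph where each node takes one of
# K candidate states ("particles"). Potentials are arbitrary positive arrays.
import numpy as np

def max_product_chain(node_pot, edge_pot):
    """node_pot: list of T length-K arrays (unary potentials per node).
       edge_pot: list of T-1 K x K arrays (pairwise potentials between neighbors).
       Returns the max-marginals and a MAP assignment (indices into each candidate set)."""
    T = len(node_pot)
    # Forward pass: fwd[t](x_t) = max over x_{t-1} of edge * unary * previous message.
    fwd = [np.ones_like(node_pot[0])]
    for t in range(1, T):
        prev = node_pot[t - 1] * fwd[t - 1]
        fwd.append((edge_pot[t - 1] * prev[:, None]).max(axis=0))
    # Backward pass: bwd[t](x_t) = max over x_{t+1} of edge * unary * next message.
    bwd = [np.ones_like(node_pot[0]) for _ in range(T)]
    for t in range(T - 2, -1, -1):
        nxt = node_pot[t + 1] * bwd[t + 1]
        bwd[t] = (edge_pot[t] * nxt[None, :]).max(axis=1)
    # Max-marginals q_s(x_s) proportional to psi_s(x_s) times all incoming messages.
    q = [node_pot[t] * fwd[t] * bwd[t] for t in range(T)]
    # Decoding by per-node argmax recovers the MAP when the maximizer is unique.
    return q, [int(np.argmax(qt)) for qt in q]

# Toy usage: 3 nodes, 4 candidate states each, random positive potentials.
rng = np.random.default_rng(0)
nodes = [rng.random(4) + 0.1 for _ in range(3)]
edges = [rng.random((4, 4)) + 0.1 for _ in range(2)]
max_marginals, map_states = max_product_chain(nodes, edges)
```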

10. Articulated Pose Estimation [Zuffi et al., CVPR 2012]
Deformable Structures (DS): continuous state for part shape, location, orientation, and scale, with a PCA shape model.
[Slide figure callouts: complicated, non-Gaussian likelihood and compatibility potentials]
