Sub-quadratic Markov tree mixture models for probability density estimation

Sub-quadratic Markov tree mixture models for probability density estimation - PowerPoint PPT Presentation



  1. Sub-quadratic Markov tree mixture models for probability density estimation. Sourour Ammar (1), Ph. Leray (1), L. Wehenkel (2). (1) Équipe COnnaissances et Décision, LINA UMR 6241, École Polytechnique de l'Université de Nantes. (2) Department of EECS and GIGA-Research, University of Liège. COMPSTAT'2010, Paris, 22-27 August 2010.

  2. A simple idea. Proposition: develop density estimation techniques that can scale to very high-dimensional spaces, by exploiting the Perturb & Combine idea with probabilistic graphical models. Outline: Background; Our proposal; Some results; Conclusions and further work.

  3-9. Background / Perturb and combine principle: the P&C principle in supervised learning (one slide, built up incrementally over slides 3-9). Principle, as used by Bagging, Random forests and Extremely randomized trees: [Diagram: a learning algorithm (a search procedure) is perturbed by randomization into weak algorithms 1, 2, ..., m; each weak algorithm produces a prediction; the m predictions are combined by a weighting scheme into a final prediction.] How can we apply this idea to density estimation with Bayesian networks (BN)? A minimal code illustration of this pipeline is sketched below.
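The diagram above is entirely generic, so here is a minimal Python illustration of the same three-stage structure for supervised classification: perturb the learning set (here, bootstrap resampling) and the learner (here, a decision stump on a randomly chosen feature), fit m weak models, then combine their predictions by an unweighted vote. These particular perturbation and combination choices are illustrative assumptions, not the specific methods cited on the slide.

    import random

    def bootstrap(data, rng):
        """Perturb the learning set: resample it with replacement."""
        return [rng.choice(data) for _ in data]

    def random_stump(sample, rng):
        """Perturbed weak learner: threshold one randomly chosen feature at its
        sample mean and predict the majority class on each side."""
        f = rng.randrange(len(sample[0][0]))
        t = sum(x[f] for x, _ in sample) / len(sample)

        def majority(rows):
            ys = [y for _, y in rows] or [sample[0][1]]
            return max(set(ys), key=ys.count)

        left = majority([(x, y) for x, y in sample if x[f] <= t])
        right = majority([(x, y) for x, y in sample if x[f] > t])
        return lambda x: left if x[f] <= t else right

    def perturb_and_combine(data, m=25, seed=0):
        """Fit m weak models on perturbed data, combine them by majority vote."""
        rng = random.Random(seed)
        models = [random_stump(bootstrap(data, rng), rng) for _ in range(m)]

        def vote(x):
            preds = [mdl(x) for mdl in models]
            return max(set(preds), key=preds.count)
        return vote

Here data is assumed to be a list of (feature vector, label) pairs; perturb_and_combine returns the combined predictor.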

  10-16. Background / Density estimation with BN (one slide, built up incrementally over slides 10-16). [Figure: a Bayesian network structure over the variables A, B, C, D, E, drawn from a space of candidate structures S̃, shown next to a dataset of binary observations of these variables from which the structure and its parameters are estimated. The factorization such a network encodes is recalled below.]
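For context (the formula is implicit in the figure rather than written on the slide), a Bayesian network with structure S and parameters θ over variables X_1, ..., X_n defines the joint distribution by the usual factorization

    P_{S,\theta}(X_1,\dots,X_n) \;=\; \prod_{i=1}^{n} P_{\theta}\bigl(X_i \mid \mathrm{Pa}_S(X_i)\bigr),

where Pa_S(X_i) denotes the parents of X_i in S. For the Markov tree structures used later, each variable has at most one parent, so parameter estimation and likelihood evaluation are both linear in n.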

  17. Background / Density estimation with BN: Bayesian averaging. Instead of searching for a single optimal model (structure + parameters): assume prior probabilities over the space of structures; determine the posterior probability of each model given a dataset; approximate the target distribution by averaging the different models, weighted by their posterior probabilities (written out below). Caveat: exact Bayesian averaging over a large space of models is not 'scalable' ⇒ it requires strongly constraining the space of structures.
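In its standard form (the slide states this only in words), Bayesian model averaging over a structure space S̃, given a dataset D, estimates the density as

    P(x \mid D) \;=\; \sum_{S \in \tilde{\mathcal{S}}} P(S \mid D)\, P(x \mid S, D).

The sum over S̃, and the posterior P(S | D) itself, become intractable for unconstrained spaces of directed acyclic graphs, which is why S̃ must be restricted.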

  18. Outline: Background; Our proposal; Some results; Conclusions and further work.

  19. Proposal: Strategy. Use simple spaces of graphical structures S̃ (e.g. chains, trees, poly-trees, etc.). Do not assume that the target distribution is representable by one of these structures. Rather, assume that the target distribution may be well approximated by a mixture of a reasonable number of (S, θ*) pairs, with S ∈ S̃ (see the mixture form written out below).
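Anticipating the notation of the following slides, the estimate takes the form of a finite mixture of tree-structured densities (how the weights μ_i are chosen is one of the degrees of freedom discussed later):

    \hat{P}(x) \;=\; \sum_{i=1}^{m} \mu_i \, P_{T_i}\!\left(x \mid \theta^{*}_i\right),
    \qquad \mu_i \ge 0, \quad \sum_{i=1}^{m} \mu_i = 1, \quad T_i \in \tilde{\mathcal{S}}.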

  20-24. Proposal: Generic algorithm principle (one slide, built up incrementally over slides 20-24). [Diagram: from the dataset of binary observations of A, B, C, D, E and the structure space S̃, draw tree structures T_1, T_2, ..., T_m; estimate their parameters θ*_1, θ*_2, ..., θ*_m; compute mixture weights μ_1, μ_2, ..., μ_m; the final model is the weighted collection (θ*_i, μ_i), i = 1..m.] A code sketch of this generic loop is given below.
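At this point the slides leave both the tree-sampling procedure and the weighting scheme unspecified, so the following is only a minimal Python sketch of the generic loop under simple placeholder assumptions: binary variables, tree structures drawn by random attachment, Laplace-smoothed maximum-likelihood parameters θ*_i, and uniform weights μ_i = 1/m. All names are illustrative, not the authors' implementation.

    import random

    def random_tree(n_vars, rng):
        """Draw a directed tree: each node (after the first, in a random order)
        attaches to one randomly chosen, already placed node as its parent."""
        order = list(range(n_vars))
        rng.shuffle(order)
        parent = {order[0]: None}
        for i, v in enumerate(order[1:], start=1):
            parent[v] = rng.choice(order[:i])
        return parent

    def fit_tree(parent, data, alpha=1.0):
        """Laplace-smoothed maximum-likelihood parameters of a binary tree BN."""
        theta = {}
        for v, p in parent.items():
            if p is None:
                ones = sum(row[v] for row in data)
                theta[v] = (ones + alpha) / (len(data) + 2 * alpha)   # P(X_v = 1)
            else:
                counts = {0: [0, 0], 1: [0, 0]}                       # counts[x_p][x_v]
                for row in data:
                    counts[row[p]][row[v]] += 1
                theta[v] = {pv: (counts[pv][1] + alpha) / (sum(counts[pv]) + 2 * alpha)
                            for pv in (0, 1)}                         # P(X_v = 1 | X_p)
        return theta

    def tree_prob(parent, theta, x):
        """Probability of one complete binary configuration x under the tree."""
        prob = 1.0
        for v, p in parent.items():
            q = theta[v] if p is None else theta[v][x[p]]
            prob *= q if x[v] == 1 else 1.0 - q
        return prob

    def tree_mixture(data, n_vars, m=10, seed=0):
        """Generic loop: draw m trees, fit each one, weight them uniformly."""
        rng = random.Random(seed)
        trees = [random_tree(n_vars, rng) for _ in range(m)]
        thetas = [fit_tree(t, data) for t in trees]
        weights = [1.0 / m] * m
        return lambda x: sum(w * tree_prob(t, th, x)
                             for t, th, w in zip(trees, thetas, weights))

Here data is assumed to be a list of length-n_vars rows of 0/1 values; tree_mixture(data, n_vars=5, m=10) returns a function that evaluates the mixture density at any complete configuration x.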

  25. Proposal: Degrees of freedom. What space S̃? [Same diagram as on the generic algorithm principle slide.]
