  1. Causal Discovery of Dynamic Bayesian Networks. Cora Perez-Ariza¹, Ann Nicholson², Kevin Korb², Steven Mascaro² and Chao Heng Hu². ¹Dept. of Computer Science and Artificial Intelligence, University of Granada. ²Clayton School of IT, Monash University.

  2. Learning Static BNs. Constraint-based learning: performs independence tests, e.g. the PC algorithm (Spirtes et al., 1993), which tests all pairs of variables for direct dependencies.

  3. Learning Static BNs. Constraint-based learning: performs independence tests, e.g. the PC algorithm (Spirtes et al., 1993), which tests all pairs of variables for direct dependencies, then finds a graph pattern: it identifies head-to-head arcs (colliders) and orients the remaining arcs where possible.
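
A minimal sketch (our illustration, not the authors' code) of the kind of pairwise independence test a constraint-based learner such as PC runs: a chi-square statistic over the contingency table of two discrete variables. A large statistic suggests a direct dependency worth keeping.

```python
# Illustrative chi-square independence statistic for two discrete
# variables, of the sort PC uses to test pairs for direct dependencies.
from collections import Counter

def chi_square_stat(xs, ys):
    """Chi-square statistic for independence of two paired discrete samples."""
    n = len(xs)
    joint = Counter(zip(xs, ys))          # observed joint counts
    px = Counter(xs)                      # marginal counts of X
    py = Counter(ys)                      # marginal counts of Y
    stat = 0.0
    for x in px:
        for y in py:
            expected = px[x] * py[y] / n  # count expected under independence
            observed = joint.get((x, y), 0)
            stat += (observed - expected) ** 2 / expected
    return stat

dep = chi_square_stat([0, 0, 1, 1], [0, 0, 1, 1])   # 4.0: strong dependence
ind = chi_square_stat([0, 1, 0, 1], [0, 0, 1, 1])   # 0.0: looks independent
```

In a full PC implementation the statistic is compared against a chi-square critical value, and conditioning sets are added when testing for indirect dependencies.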

  4. Learning Static BNs. Metric-based learning: (stochastic) search and score. Learning programs/packages: e.g. CaMML (Causal discovery via MML), BNT (Bayes Net Toolbox).

  5. Learning Static BNs. Metric-based learning: (stochastic) search and score. Search moves M → M′ add, remove, or reverse arcs; the candidate models are then scored and ranked. Learning programs/packages: e.g. CaMML (Causal discovery via MML), BNT (Bayes Net Toolbox).
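
The M → M′ moves can be sketched as follows (our illustration, not CaMML or BNT code): generate every model one add/remove/reverse step away from the current arc set, keeping only acyclic candidates for scoring.

```python
# Neighbour generation for search-and-score structure learning:
# one add, remove, or reverse move away from the current DAG.

def is_acyclic(nodes, arcs):
    """DFS cycle check on a directed graph given as a set of (u, v) arcs."""
    children = {n: [v for (u, v) in arcs if u == n] for n in nodes}
    WHITE, GREY, BLACK = 0, 1, 2
    colour = {n: WHITE for n in nodes}

    def visit(n):
        colour[n] = GREY
        for c in children[n]:
            if colour[c] == GREY or (colour[c] == WHITE and not visit(c)):
                return False              # back edge found: a cycle
        colour[n] = BLACK
        return True

    return all(visit(n) for n in nodes if colour[n] == WHITE)

def neighbours(nodes, arcs):
    """All DAGs one add/remove/reverse move from the current model."""
    out = []
    for u in nodes:                       # add an arc absent in both directions
        for v in nodes:
            if u != v and (u, v) not in arcs and (v, u) not in arcs:
                cand = arcs | {(u, v)}
                if is_acyclic(nodes, cand):
                    out.append(cand)
    for arc in arcs:                      # remove an existing arc
        out.append(arcs - {arc})
    for (u, v) in arcs:                   # reverse an existing arc
        cand = (arcs - {(u, v)}) | {(v, u)}
        if is_acyclic(nodes, cand):
            out.append(cand)
    return out
```

A stochastic searcher such as CaMML's MCMC samples among these moves rather than enumerating them all; the enumeration above just makes the move set concrete.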

  6. Dynamic Bayesian Networks. An extension of BNs with arcs from slice t to slice t+1. The DBNs we consider: 1. the same structure for each slice (i.e. stationary); 2. arcs cannot span more than one time step. (Figure: slices t0, t1, t2.)
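
The two structural constraints above can be checked mechanically. A minimal sketch, using our own (name, slice) node notation rather than anything from the paper's software:

```python
# Check the two DBN constraints from the slide: every slice has the same
# intra-slice structure (stationarity), and inter-slice arcs span exactly
# one forward time step. Nodes are (name, t) pairs; arcs are node pairs.

def valid_dbn(arcs, n_slices):
    intra = {t: set() for t in range(n_slices)}
    for (u, tu), (v, tv) in arcs:
        if tu == tv:
            intra[tu].add((u, v))         # record intra-slice arc
        elif tv - tu != 1:
            return False                  # spans >1 step or points backwards
    # stationarity: every slice must share slice 0's intra-slice arcs
    return all(intra[t] == intra[0] for t in range(n_slices))
```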

  7. Learning Dynamic Bayesian Networks. Why not use existing static learners? We need to guarantee that slice t nodes come before slice t+1 nodes, and we often want the slices to be the same (i.e. stationary). Exploiting this structure makes the search more efficient and produces better models.

  8. Learning DBNs – Previous Approaches. Friedman et al. (1998): uses BIC/BDe scoring with hill-climbing; learns the prior/initial network and the transition network. (Figure: prior network + transition network = the corresponding DBN.)

  9. Learning DBNs – Previous Approaches. Bayes Net Toolbox (BNT): written by Kevin Murphy (2001); supports DBN learning and inference. The BNT algorithm uses BIC/ML scoring and only learns arcs between slices (temporal arcs), which guarantees the arcs respect the temporal order.

  10. Two New Approaches to Learning DBNs. 1. Enforce a stationary DBN structure with structural priors. 2. Enhance an existing search-and-score procedure to take DBN structure into account. Both take advantage of our BN learner software, CaMML.

  11. CaMML. A Bayesian network learner created at Monash. Uses MML for the score and MCMC for the search. Flexible priors can be specified: A -> B (direct causal connection); A – B (direct relation); A => B (ancestral relation); A ~ B (correlation); tiers; an existing BN structure.


  13. CaMML Tier Priors Learning

  14. CaMML Tier Priors Learning. Example tier ordering over slices t0 ≺ t1 ≺ t2: {Motivation, SES, Education} ≺ {Motivation1, SES1, Education1}.
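
What a tier prior expresses can be sketched in a few lines (our illustration, using the slide's variable names): an arc may run within a tier or from an earlier tier to a later one, never backwards in time.

```python
# Tier prior as an arc filter: t0 variables in tier 0, t1 variables in
# tier 1; arcs must not point from a later tier to an earlier one.
tiers = {"Motivation": 0, "SES": 0, "Education": 0,
         "Motivation1": 1, "SES1": 1, "Education1": 1}

def arc_allowed(u, v, tiers):
    """An arc u -> v is admissible iff u's tier is not after v's tier."""
    return tiers[u] <= tiers[v]

arc_allowed("SES", "Education1", tiers)    # True: t0 -> t1
arc_allowed("Education1", "SES", tiers)    # False: would point back in time
```

During search, any move that would create a disallowed arc is simply never proposed, which is how such a prior shrinks the space of candidate models.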

  15. CaMML 2-Step Learning. Step 1: learn a BN for slice t0 from the data. Step 2: copy that network into the next slice, then learn the transitional arcs, giving the learned DBN.
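
The 2-step construction can be sketched as follows, assuming the two learning steps have already produced their arc sets (the learners themselves are not reproduced here; the notation is ours):

```python
# Assemble a DBN from 2-step learning: the static structure learned for
# slice t0 is copied into slice t1 (stationarity), then the separately
# learned transitional arcs are added. Nodes are (name, slice) pairs.

def build_dbn(static_arcs, transitional_arcs):
    """static_arcs: arcs among t0 variables, e.g. ('X', 'Y').
    transitional_arcs: arcs from a t0 variable to a t1 variable.
    Returns the full arc set over (name, slice) nodes."""
    dbn = set()
    for (u, v) in static_arcs:
        dbn.add(((u, 0), (v, 0)))        # learned slice t0 structure
        dbn.add(((u, 1), (v, 1)))        # copied into slice t1
    for (u, v) in transitional_arcs:
        dbn.add(((u, 0), (v, 1)))        # learned transitional arcs
    return dbn
```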

  16. Experiments. Test models. We compare CaMML against two other learning programs: the PC algorithm (in GeNIe) and BNT.

  17. Milk Infection DBN. (Figure: slices t0, t1.) We use mutual information to score the strength of arcs.
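
The slide does not spell out the formula; the following is the standard mutual information for discrete samples, which is the usual way to score arc strength (our sketch, not the paper's code):

```python
# Mutual information (in bits) between two paired discrete samples:
# high MI suggests a strong arc, MI near zero a weak or absent one.
from collections import Counter
from math import log2

def mutual_information(xs, ys):
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px = Counter(xs)
    py = Counter(ys)
    mi = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n
        # p(x,y) * log2( p(x,y) / (p(x) * p(y)) ), with counts cancelled
        mi += p_xy * log2(p_xy * n * n / (px[x] * py[y]))
    return mi

mutual_information([0, 0, 1, 1], [0, 0, 1, 1])   # 1.0 bit: fully informative
mutual_information([0, 1, 0, 1], [0, 0, 1, 1])   # 0.0: uninformative
```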

  18. BAT DBN. Too many variables to show! (Figure: slices t0, t1.)

  19. Experiment #1: plain static BN learning (without using priors); CaMML vs GeNIe (PC algorithm). Experiment #2: learning with tier priors; CaMML vs GeNIe and BNT. Experiment #3: learning using tier priors vs the 2-step algorithm.

  20. Experiment Procedure. From the known models, generate data; learn DBNs from the data (with tier or BN priors); test the learned models against the known models with ED/CKL.

  21. Evaluation. Edit distance: count 1 for each arc that is missing, added, or reversed in the learned model. Our modification for DBNs weights static and temporal arc errors separately: ED_DBN = W_s · N_s + W_t · N_t. Causal Kullback-Leibler divergence: computes the distance between the probability distributions of model P and model Q.
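
A minimal sketch of the weighted edit distance, comparing arc sets by symmetric difference (a simplification of the slide's rule: a reversed arc counts here as two errors, one missing plus one added, rather than one):

```python
# Weighted DBN edit distance in the spirit of ED_DBN = W_s*N_s + W_t*N_t:
# static-arc and temporal-arc errors are counted separately and weighted.

def ed_dbn(true_static, learned_static, true_temporal, learned_temporal,
           w_s=1.0, w_t=1.0):
    """Each argument is a set of (u, v) arcs. An error is an arc present
    in one model but not the other; note a reversal counts twice under
    this symmetric-difference simplification."""
    n_s = len(true_static ^ learned_static)      # static arc errors
    n_t = len(true_temporal ^ learned_temporal)  # temporal arc errors
    return w_s * n_s + w_t * n_t
```

The two weights let an evaluator penalise temporal-arc mistakes differently from static ones, which is the point of the modification.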

  22. Results Milk and cancer model (CaMML vs GeNIe, using tiers)


  24. Results. Transitional arc errors for the BAT network:
      Data size   CaMML w/ Tiers   GeNIe (PC)   BNT
      500         6.8 (0.98)       13           7.5 (0.50)
      5000        5.4 (1.62)       10           6.2 (0.74)
      50000       1.0 (0.0)        10           3.7 (0.33)

  25. Results. BAT model (CaMML, tiers vs 2-step learning). (Chart: total errors vs data size 500/5000/50000, tiers vs 2-step algorithm.)

  26. Results. BAT model (CaMML, tiers vs 2-step learning). (Chart: static arc errors vs data size 500/5000/50000, tiers vs 2-step algorithm.)

  27. Results. BAT model (CaMML, tiers vs 2-step learning). (Chart: temporal arc errors vs data size 500/5000/50000, tiers vs 2-step algorithm.)

  28. Summary. GeNIe (PC) tends to over-fit (i.e. adds more arcs) as data size grows in Experiment #1. Using tiers, CaMML produces fewer errors than BNT and GeNIe (PC). CaMML can recover more weak arcs, and usually learns all the strong arcs. The 2-step learning algorithm produces comparable results and is better at learning static arcs.

  29. CaMML 2-Step Learning Issues


  31. Current and Future Work. Modify CaMML's search and score: alter the score to avoid double-counting static arcs; alter the search to avoid invalid DBN structures; ultimately, reduce the search space so that good models can be found more quickly.
