

  1. ECE 6504: Advanced Topics in Machine Learning
     Probabilistic Graphical Models and Large-Scale Learning
     Topics:
     – Bayes Nets: Inference
     – Marginals, MPE, MAP
     – Variable Elimination
     Readings: KF 9.1, 9.2; Barber 5.1
     Dhruv Batra, Virginia Tech

  2. Administrativia
     • HW1
       – Out. Due in 2 weeks: Feb 17 / Feb 19, 11:59pm
       – Please, please, please start early
       – Implementation: TAN, structure + parameter learning
       – Please post questions on the Scholar Forum
     • HW2
       – Out soon. Due in 2 weeks: Mar 5, 11:59pm
     • Project Proposal
       – Due: Mar 12, 11:59pm
       – <= 2 pages, NIPS format
     (C) Dhruv Batra

  3. Recap of Last Time

  4. Learning Bayes nets

     |                       | Known structure    | Unknown structure |
     |-----------------------|--------------------|-------------------|
     | Fully observable data | Very easy          | Hard              |
     | Missing data          | Somewhat easy (EM) | Very very hard    |

     [Figure: data x(1), …, x(m) feed into CPTs P(Xi | Pa_Xi); structure and parameters]
     Slide Credit: Carlos Guestrin

  5. Main Issues in PGMs
     • Representation
       – How do we store P(X1, X2, …, Xn)?
       – What does my model mean/imply/assume? (Semantics)
     • Learning
       – How do we learn the parameters and structure of P(X1, X2, …, Xn) from data?
       – Which model is right for my data?
     • Inference
       – How do I answer questions/queries with my model? Such as:
       – Marginal Estimation: P(X5 | X1, X4)
       – Most Probable Explanation: argmax P(X1, X2, …, Xn)

  6. Plan for Today
     • BN Inference
       – Queries: Marginals, Conditional Probabilities, MAP, MPE
       – Variable Elimination

  7. Example
     • HW1 Inference: Tree-Augmented Naïve Bayes (TAN)

  8. Possible Queries
     [BN: Flu → Sinus ← Allergy; Sinus → Headache, Sinus → Nose=t]
     • Evidence: E = e (e.g. N=t)
     • Query variables of interest: Y
     • Conditional Probability: P(Y | E = e)
       – E.g. P(F, A | N=t)
       – Special case: Marginals, e.g. P(F)
     • Maximum a Posteriori: argmax P(all variables | E = e)
       – argmax_{f,a,s,h} P(f, a, s, h | N=t)
       – Old-school terminology: MPE
     • Marginal-MAP: argmax_y P(Y | E = e)
       – = argmax_y Σ_o P(Y = y, O = o | E = e)
       – Old-school terminology: MAP
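The three query types above can be spelled out by brute-force enumeration on the slide's Flu/Allergy network. This is a sketch, not the course's code: every CPT number below is made up for illustration, and the factorization P(F, A, S, H, N) = P(F) P(A) P(S|F,A) P(H|S) P(N|S) is assumed from the drawn graph.

```python
from itertools import product

# Made-up CPTs for the Flu/Allergy network (all numbers are assumptions):
pF = {True: 0.1, False: 0.9}                      # P(F)
pA = {True: 0.2, False: 0.8}                      # P(A)
pS_t = {(True, True): 0.9, (True, False): 0.7,    # P(S=t | F, A)
        (False, True): 0.6, (False, False): 0.05}
pH_t = {True: 0.8, False: 0.1}                    # P(H=t | S)
pN_t = {True: 0.7, False: 0.05}                   # P(N=t | S)

def joint(f, a, s, h, n):
    """P(F=f, A=a, S=s, H=h, N=n) from the BN factorization."""
    ps = pS_t[(f, a)] if s else 1 - pS_t[(f, a)]
    ph = pH_t[s] if h else 1 - pH_t[s]
    pn = pN_t[s] if n else 1 - pN_t[s]
    return pF[f] * pA[a] * ps * ph * pn

B = (True, False)

# Conditional probability query: P(F, A | N=t)
Z = sum(joint(f, a, s, h, True) for f, a, s, h in product(B, repeat=4))
cond = {(f, a): sum(joint(f, a, s, h, True) for s, h in product(B, repeat=2)) / Z
        for f, a in product(B, repeat=2)}

# Marginal query: P(F=t); summing out all other variables recovers the prior
pF_marg = sum(joint(True, a, s, h, n) for a, s, h, n in product(B, repeat=4))

# MPE (old-school): the single most likely full assignment (f, a, s, h) given N=t
mpe = max(product(B, repeat=4), key=lambda x: joint(*x, True))

# Marginal-MAP with Y = {F}: argmax_f of the sum over the remaining variables
mmap_f = max(B, key=lambda f: sum(joint(f, a, s, h, True)
                                  for a, s, h in product(B, repeat=3)))
```

Note that MPE maximizes over a single joint table while marginal-MAP interleaves a sum and a max, which is why the hardness slide later treats them differently.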

  9. Car Starts BN
     • 18 binary attributes
     • Inference
       – P(BatteryAge | Starts=f)
     • Naively 2^18 terms; why is it so fast?
     Slide Credit: Carlos Guestrin

 10. Application: Computer Vision
     Semantic segmentation with a grid-model Markov random field (blue nodes)
     Image Credit: Simon JD Prince

 11. Application: Computer Vision
     Parsing the human body with a tree model
     Image Credit: Simon JD Prince

 12. Application: Coding
     [Factor graph: observed bits, true bits, parity constraints]

 13. Application: Medical Diagnosis
     Image Credit: Erik Sudderth

 14. Are MAP and Max of Marginals Consistent?
     [Two-node BN: Sinus → Nose, with CPT P(N|S); P(S=f)=0.6, P(S=t)=0.4]
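The answer is no, and the slide's two-node network is enough to show it. The prior P(S=f)=0.6, P(S=t)=0.4 is from the slide; the CPT P(N|S) below is an assumption chosen so the disagreement appears: the jointly most likely assignment differs from taking each variable's marginal argmax separately.

```python
from itertools import product

pS = {'t': 0.4, 'f': 0.6}                        # P(S), from the slide
pN_given_S = {('t', 't'): 1.0, ('t', 'f'): 0.0,  # P(N | S=t)  -- assumed CPT
              ('f', 't'): 0.5, ('f', 'f'): 0.5}  # P(N | S=f)  -- assumed CPT

# Joint P(S=s, N=n) over the four assignments
joint = {(s, n): pS[s] * pN_given_S[(s, n)] for s, n in product('tf', repeat=2)}

# MAP: argmax of the joint table
map_assign = max(joint, key=joint.get)           # -> ('t', 't'), prob 0.4

# Max of marginals: argmax each variable's marginal independently
marg_S = {s: sum(joint[(s, n)] for n in 'tf') for s in 'tf'}   # S: f=0.6, t=0.4
marg_N = {n: sum(joint[(s, n)] for s in 'tf') for n in 'tf'}   # N: t=0.7, f=0.3
greedy = (max(marg_S, key=marg_S.get), max(marg_N, key=marg_N.get))  # -> ('f', 't')
```

Here the marginal-argmax assignment (S=f, N=t) has joint probability 0.3, while the true MAP assignment (S=t, N=t) has 0.4, so maximizing marginals one variable at a time is not a valid substitute for MAP.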

 15. Hardness
     • Evaluate P(all variables), i.e. the probability of one full assignment: easy for a BN, O(n)
     • MAP
       – Find argmax P(all variables | E = e): NP-hard
       – Find any assignment with P(all variables | E = e) > p: NP-hard
     • Conditional Probability / Marginals
       – Is P(Y=y | E = e) > 0? NP-hard
       – Find P(Y=y | E = e): #P-hard
       – Find P(Y=y | E = e) to within additive error ε: NP-hard for any ε < 0.5
     • Marginal-MAP
       – Find argmax_y Σ_o P(Y = y, O = o | E = e): NP^PP-hard

 16. Is Inference in BNs Hopeless?
     • In general, yes: even approximate inference is hard!
     • In practice
       – Exploit structure
       – Many effective approximation algorithms, some with guarantees
     • Plan
       – Exact inference
       – Transition to undirected graphical models (MRFs)
       – Approximate inference in the unified setting

 17. Algorithms
     • Conditional Probability / Marginals
       – Variable Elimination
       – Sum-Product Belief Propagation
       – Sampling: MCMC
     • MAP
       – Variable Elimination
       – Max-Product Belief Propagation
       – Sampling: MCMC
       – Integer Programming
         • Linear Programming Relaxation
       – Combinatorial Optimization (Graph-cuts)

 18. Marginal Inference Example
     [BN: Flu → Sinus ← Allergy; Sinus → Headache, Sinus → Nose=t]
     • Evidence: E = e (e.g. N=t)
     • Query variables of interest: Y
     • Conditional Probability: P(Y | E = e)
       – P(F | N=t)
       – Derivation on board

 19. Marginal Inference Example
     [BN: Flu → Sinus ← Allergy; Sinus → Headache, Sinus → Nose=t]
     Naive inference seems exponential in the number of variables!
     In fact, exact inference in graphical models is NP-hard.
     Slide Credit: Carlos Guestrin

 20. Variable Elimination Algorithm
     • Given a BN and a query, P(Y | e) ∝ P(Y, e)   IMPORTANT!!!
     • Choose an ordering on the variables, e.g., X1, …, Xn
     • For i = 1 to n: if Xi ∉ {Y, E}
       – Collect the factors f1, …, fk that include Xi
       – Generate a new factor by eliminating (summing out) Xi from these factors
       – Variable Xi has been eliminated!
     • Normalize P(Y, e) to obtain P(Y | e)
     Slide Credit: Carlos Guestrin
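The loop above can be sketched directly as code. This is a minimal illustration, not the course's reference implementation: factors are tables over named binary variables, and the toy chain F → S at the bottom uses made-up CPT numbers.

```python
from itertools import product

class Factor:
    """A table over named binary variables: a scope plus a value table."""
    def __init__(self, scope, table):
        self.scope = tuple(scope)   # e.g. ('F', 'S')
        self.table = table          # {(f_val, s_val): probability}

def multiply(f1, f2):
    """Pointwise product of two factors over the union of their scopes."""
    scope = f1.scope + tuple(v for v in f2.scope if v not in f1.scope)
    table = {}
    for vals in product((True, False), repeat=len(scope)):
        asg = dict(zip(scope, vals))
        table[vals] = (f1.table[tuple(asg[v] for v in f1.scope)] *
                       f2.table[tuple(asg[v] for v in f2.scope)])
    return Factor(scope, table)

def sum_out(f, var):
    """Eliminate `var` from factor `f` by summing over its values."""
    scope = tuple(v for v in f.scope if v != var)
    table = {}
    for vals, p in f.table.items():
        key = tuple(v for name, v in zip(f.scope, vals) if name != var)
        table[key] = table.get(key, 0.0) + p
    return Factor(scope, table)

def eliminate(factors, order):
    """Sum out each variable in `order`; return the product of what remains."""
    factors = list(factors)
    for var in order:
        involved = [f for f in factors if var in f.scope]
        if not involved:
            continue
        prod = involved[0]
        for f in involved[1:]:
            prod = multiply(prod, f)
        factors = [f for f in factors if var not in f.scope] + [sum_out(prod, var)]
    result = factors[0]
    for f in factors[1:]:
        result = multiply(result, f)
    return result

# Toy chain F -> S with made-up CPTs P(F) and P(S | F); eliminating F gives P(S).
pF = Factor(('F',), {(True,): 0.1, (False,): 0.9})
pS = Factor(('F', 'S'), {(True, True): 0.7, (True, False): 0.3,
                         (False, True): 0.05, (False, False): 0.95})
marg = eliminate([pF, pS], ['F'])
# marg.table[(True,)] == 0.1*0.7 + 0.9*0.05 == 0.115
```

For a query with evidence, one would first reduce each factor to the rows consistent with e and normalize the final table, matching the slide's last step.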

 21. Complexity of Variable Elimination (Graphs with Loops)
     Exponential in the number of variables in the largest factor generated
     Slide Credit: Carlos Guestrin
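The size of the largest factor depends on the elimination order, and the effect can be seen without any probabilities by tracking factor scopes symbolically. The star graph below (a hub X0 with four leaves) is my own illustrative example, not from the slides: eliminating leaves first keeps every factor at two variables, while eliminating the hub first creates a factor over all five.

```python
def max_factor_size(scopes, order):
    """Simulate variable elimination on factor scopes only; return the
    number of variables in the largest intermediate factor created."""
    scopes = [set(s) for s in scopes]
    worst = max(len(s) for s in scopes)
    for var in order:
        involved = [s for s in scopes if var in s]
        if not involved:
            continue
        merged = set().union(*involved) - {var}
        # the intermediate product factor contains var before it is summed out
        worst = max(worst, len(merged) + 1)
        scopes = [s for s in scopes if var not in s] + [merged]
    return worst

# Star graph: pairwise factors {X0, Xi} for i = 1..4
star = [{'X0', f'X{i}'} for i in range(1, 5)]
leaves_first = max_factor_size(star, ['X1', 'X2', 'X3', 'X4', 'X0'])  # -> 2
hub_first = max_factor_size(star, ['X0', 'X1', 'X2', 'X3', 'X4'])     # -> 5
```

Since cost is exponential in the largest factor, the leaves-first order runs in O(2^2) per step while the hub-first order pays O(2^5); finding the best order in general is itself NP-hard.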

 22. Pruning Irrelevant Variables
     [BN: Flu → Sinus ← Allergy; Sinus → Headache, Sinus → Nose=t]
     • Prune all non-ancestors of the query variables
     • More generally: prune all nodes not on an active trail between the evidence and query variables
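The simple "prune all non-ancestors" rule is a short graph traversal. A minimal sketch on the slide's network, assuming the parent lists drawn in the figure: for the query P(Flu | Nose=t), Headache is a non-ancestor of both query and evidence and drops out.

```python
# Parent lists for the Flu/Allergy network, as drawn on the slide.
parents = {'Flu': [], 'Allergy': [], 'Sinus': ['Flu', 'Allergy'],
           'Nose': ['Sinus'], 'Headache': ['Sinus']}

def ancestors_and_self(nodes):
    """Return the given nodes plus all of their ancestors in the BN."""
    keep, stack = set(), list(nodes)
    while stack:
        n = stack.pop()
        if n not in keep:
            keep.add(n)
            stack.extend(parents[n])
    return keep

# Query P(Flu | Nose=t): keep ancestors of {Flu, Nose}; Headache is pruned.
relevant = ancestors_and_self({'Flu', 'Nose'})
# relevant == {'Flu', 'Allergy', 'Sinus', 'Nose'}
```

The stronger active-trail rule would prune further in some networks, but requires d-separation checks rather than a plain ancestor walk.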
