47. Uncertainty: Representation




Foundations of Artificial Intelligence
47. Uncertainty: Representation
Malte Helmert and Gabriele Röger
University of Basel, May 24, 2017

Contents:
47.1 Introduction
47.2 Conditional Independence
47.3 Bayesian Networks
47.4 Summary

Uncertainty: Overview

chapter overview:
◮ 46. Introduction and Quantification
◮ 47. Representation of Uncertainty

47.1 Introduction

Running Example

We continue the dentist example with its full joint probability distribution:

                toothache           ¬toothache
            catch    ¬catch     catch    ¬catch
  cavity    0.108     0.012     0.072     0.008
 ¬cavity    0.016     0.064     0.144     0.576

Full Joint Probability Distribution: Discussion

Advantage: contains all necessary information.
Disadvantage: prohibitively large in practice; a table for n Boolean variables has size O(2^n).

Good for theoretical foundations, but what to do in practice?

47.2 Conditional Independence

Reminder: Bayes' Rule

General version with multivalued variables, conditioned on some background evidence e:

P(Y | X, e) = P(X | Y, e) P(Y | e) / P(X | e)
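To make the table queries and Bayes' rule concrete, here is a minimal sketch in plain Python (the dictionary layout and helper names are our own choices, not from the slides) that stores the dentist joint distribution and answers a conditional query by summing entries:

```python
# Full joint distribution over (Cavity, Toothache, Catch) from the dentist example.
# Keys are (cavity, toothache, catch) truth values; values are probabilities.
joint = {
    (True,  True,  True):  0.108, (True,  True,  False): 0.012,
    (True,  False, True):  0.072, (True,  False, False): 0.008,
    (False, True,  True):  0.016, (False, True,  False): 0.064,
    (False, False, True):  0.144, (False, False, False): 0.576,
}

def prob(pred):
    """Sum the joint entries of all worlds where the predicate holds."""
    return sum(p for world, p in joint.items() if pred(world))

# P(toothache) = 0.108 + 0.012 + 0.016 + 0.064 = 0.2
p_toothache = prob(lambda w: w[1])

# P(cavity | toothache) = P(cavity, toothache) / P(toothache) = 0.12 / 0.2 = 0.6
print(prob(lambda w: w[0] and w[1]) / p_toothache)  # 0.6

# Bayes' rule check: P(cavity | toothache)
#   = P(toothache | cavity) P(cavity) / P(toothache)
p_cavity = prob(lambda w: w[0])                                  # 0.2
p_tooth_given_cavity = prob(lambda w: w[0] and w[1]) / p_cavity  # 0.6
print(p_tooth_given_cavity * p_cavity / p_toothache)             # 0.6 again
```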

Multiple Evidence

If we already know that the probe catches and the tooth aches, we could compute the probability that this patient has a cavity from

P(Cavity | catch ∧ toothache) = α P(catch ∧ toothache | Cavity) P(Cavity).

Problem: we need the conditional probability of catch ∧ toothache for each value of Cavity
⇝ the same scalability problem as with the full joint distribution.

Conditional Independence: Example

The variables Toothache and Catch are not independent, but they are independent given the presence or absence of a cavity (see the joint distribution table above):

P(Toothache, Catch | Cavity) = P(Toothache | Cavity) P(Catch | Cavity)

Conditional Independence

Definition:
Two variables X and Y are conditionally independent given a third variable Z if

P(X, Y | Z) = P(X | Z) P(Y | Z).

Conditional Independence and Multiple Evidence

Multiple evidence:

P(Cavity | catch ∧ toothache)
= α P(catch ∧ toothache | Cavity) P(Cavity)
= α P(toothache | Cavity) P(catch | Cavity) P(Cavity).

⇝ No need for conditional joint probabilities for conjunctions (a worked sketch follows below).
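As a quick check of this factorisation, a small self-contained sketch: the three small tables below are read off the joint distribution above (e.g. P(toothache | cavity) = 0.12/0.2 = 0.6), and the normalised product reproduces the answer one gets directly from the full table:

```python
# Multiple evidence via conditional independence (dentist example).
p_cavity    = {True: 0.2, False: 0.8}   # P(Cavity = val)
p_toothache = {True: 0.6, False: 0.1}   # P(toothache | Cavity = val)
p_catch     = {True: 0.9, False: 0.2}   # P(catch | Cavity = val)

# Unnormalised: P(toothache | Cavity) * P(catch | Cavity) * P(Cavity)
unnorm = {c: p_toothache[c] * p_catch[c] * p_cavity[c] for c in (True, False)}
alpha = 1 / sum(unnorm.values())        # 1 / (0.108 + 0.016)
print(alpha * unnorm[True])             # ~0.871

# The direct computation from the joint table gives the same value,
# 0.108 / (0.108 + 0.016) ~ 0.871, because the dentist table satisfies
# the conditional independence exactly.
```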

Conditional Independence: Decomposition of Joint Distributions

Full joint distribution:

P(Toothache, Catch, Cavity)
= P(Toothache, Catch | Cavity) P(Cavity)
= P(Toothache | Cavity) P(Catch | Cavity) P(Cavity)

⇝ The large table can be decomposed into three smaller tables.

For n symptoms that are all conditionally independent given Cavity, the representation grows as O(n) instead of O(2^n).

47.3 Bayesian Networks

Bayesian Networks

Definition:
A Bayesian network is a directed acyclic graph where
◮ each node corresponds to a random variable, and
◮ each node X has an associated conditional probability distribution P(X | parents(X)) that quantifies the effect of the parents on the node.

Bayesian networks are also called belief networks or probabilistic networks. They are a subclass of graphical models.

Bayesian Network: Example

Burglary and Earthquake are the parents of Alarm; Alarm is the only parent of JohnCalls and of MaryCalls.

P(B) = 0.001        P(E) = 0.002

B  E    P(A)
t  t    0.95
t  f    0.94
f  t    0.29
f  f    0.001

A    P(J)        A    P(M)
t    0.90        t    0.70
f    0.05        f    0.01
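A minimal sketch of this example network as plain Python dictionaries (our own encoding, not from the slides); the product computed at the end anticipates the semantics made precise in the next section:

```python
# The example network as conditional probability tables.
# Ten numbers specify it; a full joint over five Booleans needs 2^5 - 1 = 31.
p_b = 0.001                      # P(Burglary)
p_e = 0.002                      # P(Earthquake)
p_a = {(True, True): 0.95, (True, False): 0.94,      # P(Alarm | B, E)
       (False, True): 0.29, (False, False): 0.001}
p_j = {True: 0.90, False: 0.05}  # P(JohnCalls | Alarm)
p_m = {True: 0.70, False: 0.01}  # P(MaryCalls | Alarm)

# One entry of the joint distribution as the product of local probabilities:
# P(j, m, a, ¬b, ¬e) = P(j | a) P(m | a) P(a | ¬b, ¬e) P(¬b) P(¬e)
p = p_j[True] * p_m[True] * p_a[(False, False)] * (1 - p_b) * (1 - p_e)
print(p)  # ~0.000628
```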

Semantics

The semantics of Bayesian networks expresses that
◮ the information associated with each node represents a conditional probability distribution, and that
◮ each variable is conditionally independent of its non-descendants given its parents.

Definition:
A Bayesian network with nodes {X1, ..., Xn} represents the full joint probability distribution given by

P(X1 = x1 ∧ · · · ∧ Xn = xn) = ∏_{i=1}^{n} P(Xi = xi | parents(Xi)).

Naive Construction

Order all variables, e.g. as X1, ..., Xn.
For i = 1 to n:
◮ Choose from X1, ..., Xi−1 a minimal set of parents of Xi such that P(Xi | Xi−1, ..., X1) = P(Xi | parents(Xi)).
◮ For each parent, insert a link from the parent to Xi.
◮ Define the conditional probability table P(Xi | parents(Xi)).

(A runnable sketch of this procedure follows at the end of this section.)

Compactness

The compactness of Bayesian networks stems from local structure in domains where random variables are directly influenced by only a small number of other variables:
◮ n Boolean random variables
◮ each variable directly influenced by at most k others
◮ the full joint probability distribution contains 2^n numbers
◮ the Bayesian network can be specified by n · 2^k numbers

For instance, with n = 30 and k = 5 this is 960 numbers instead of more than a billion.

Influence of Node Ordering

A bad node ordering can lead to large numbers of parents and probability distributions that are hard to specify.

[Figure: the alarm network constructed with two different variable orderings, (a) MaryCalls, JohnCalls, Alarm, Burglary, Earthquake and (b) MaryCalls, JohnCalls, Earthquake, Burglary, Alarm.]
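Below is a small, self-contained sketch of this construction on the dentist distribution (the variable indexing, helper names and numeric tolerance are our own choices; for Boolean variables it suffices to compare P(Xi = true | ·), since the false case is complementary):

```python
from itertools import combinations, product

VARS = ["Cavity", "Toothache", "Catch"]   # chosen ordering X1, X2, X3
JOINT = {   # dentist joint, indexed by (cavity, toothache, catch)
    (True, True, True): 0.108,   (True, True, False): 0.012,
    (True, False, True): 0.072,  (True, False, False): 0.008,
    (False, True, True): 0.016,  (False, True, False): 0.064,
    (False, False, True): 0.144, (False, False, False): 0.576,
}

def p(assign):
    """Probability of a partial assignment {var_index: value}."""
    return sum(pr for w, pr in JOINT.items()
               if all(w[i] == v for i, v in assign.items()))

def cond(i, given):
    """P(X_i = True | given) for a partial assignment `given`."""
    return p({**given, i: True}) / p(given)

def minimal_parents(i):
    """Smallest set S of predecessors with P(X_i | X_{i-1},...,X_1) = P(X_i | S)."""
    preds = list(range(i))
    for size in range(len(preds) + 1):
        for S in combinations(preds, size):
            if all(abs(cond(i, dict(zip(preds, vals)))
                       - cond(i, {j: v for j, v in zip(preds, vals) if j in S}))
                   < 1e-9
                   for vals in product([True, False], repeat=len(preds))):
                return S
    return tuple(preds)

for i, name in enumerate(VARS):
    print(name, "<-", [VARS[j] for j in minimal_parents(i)])
# Cavity <- [], Toothache <- ['Cavity'], Catch <- ['Cavity']
```

Running the same sketch with the ordering ['Toothache', 'Catch', 'Cavity'] instead makes Cavity pick up both predecessors as parents, so the network needs 7 numbers rather than 5, which illustrates the ordering effect discussed above.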

Conditional Independence Given Parents

Each variable is conditionally independent of its non-descendants given its parents.

[Figure: a node X with parents U1, ..., Um, children Y1, ..., Yn, and non-descendants Zij. X is conditionally independent of the nodes Zij given U1, ..., Um.]

Conditional Independence Given Markov Blanket

The Markov blanket of a node consists of its parents, its children and its children's other parents.

Each variable is conditionally independent of all other nodes in the network given its Markov blanket.

[Figure: the same network with X's Markov blanket highlighted (gray area). A small structural sketch of Markov blankets follows after the summary.]

47.4 Summary

Summary & Outlook

Summary:
◮ Conditional independence is weaker than (unconditional) independence but occurs more frequently.
◮ Bayesian networks exploit conditional independence to compactly represent joint probability distributions.

Outlook:
◮ There are exact and approximate inference algorithms for Bayesian networks.
◮ Exact inference in Bayesian networks is NP-hard (but tractable for some subclasses such as polytrees).
◮ All concepts can be extended to continuous random variables.
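As a closing illustration of the Markov blanket definition, a minimal sketch that computes blankets from the graph structure alone (the parents dictionary encodes the alarm network from above; the function name is our own):

```python
# Markov blanket of a node: its parents, its children,
# and the children's other parents. Network given as child -> parents.
parents = {
    "Burglary": [], "Earthquake": [],
    "Alarm": ["Burglary", "Earthquake"],
    "JohnCalls": ["Alarm"], "MaryCalls": ["Alarm"],
}

def markov_blanket(x):
    children = [c for c, ps in parents.items() if x in ps]
    blanket = set(parents[x]) | set(children)
    for c in children:            # add the children's other parents
        blanket |= set(parents[c])
    blanket.discard(x)            # x itself is not in its own blanket
    return blanket

print(markov_blanket("Earthquake"))  # {'Alarm', 'Burglary'}
print(markov_blanket("Alarm"))       # all four other nodes
```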
