  1. Inference in Belief Networks
 CMPUT 366: Intelligent Systems
 P&M §8.4

  2. Lecture Outline
 1. Recap
 2. Factors
 3. Variable Elimination
 4. Efficiency

  3. Recap: Belief Networks
 Definition: A belief network (or Bayesian network) consists of:
 1. A directed acyclic graph, with each node labelled by a random variable
 2. A domain for each random variable
 3. A conditional probability table for each variable given its parents
 • The graph represents a specific factorization of the full joint distribution
 • Semantics: Every node is independent of its non-descendants, conditional on its parents

  4. Recap: Queries
 • The most common task for a belief network is to query posterior probabilities given some observations
 • Easy cases:
 • Posteriors of a single variable conditional only on its parents
 • Joint distributions of variables early in a compatible variable ordering
 • Typically, the observations have no straightforward relationship to the target
 • This lecture: a mechanical procedure for computing arbitrary queries
 [Figure: the fire-alarm belief network with nodes Tampering, Fire, Alarm, Smoke, Leaving, Report]

  5. Factors
 • The Variable Elimination algorithm exploits the factorization of a joint probability distribution encoded by a belief network in order to answer queries
 • A factor is a function f(X1, ..., Xk) from random variables to a real number
 • Input: factors representing the conditional probability tables from the belief network's chain rule decomposition.
 Pr(Leaving|Alarm) Pr(Smoke|Fire) Pr(Alarm|Tampering,Fire) Pr(Tampering) Pr(Fire)
 becomes
 f1(Leaving,Alarm) f2(Smoke,Fire) f3(Alarm,Tampering,Fire) f4(Tampering) f5(Fire)
 • Output: A new factor encoding the target posterior distribution

  6. Conditional Probabilities as Factors
 • A conditional probability P(Y | X1, ..., Xn) is a factor f(Y, X1, ..., Xn) that obeys the constraint:
 ∀ v1 ∈ dom(X1), ..., vn ∈ dom(Xn):  ∑y∈dom(Y) f(y, v1, ..., vn) = 1
 • The answer to a query is a factor constructed by applying operations to the input factors
 • Operations on factors are not guaranteed to maintain this constraint!
 • Solution: Don't sweat it!
 • Operate on unnormalized probabilities during the computation
 • Normalize at the end of the algorithm to re-impose the constraint
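As a concrete illustration of the constraint and of end-of-algorithm normalization, here is a minimal sketch. It represents a factor as a (vars, table) pair, where `vars` is a tuple of variable names and `table` maps assignment tuples to numbers; this representation and the helper names are illustrative assumptions, not the course's code.

```python
def is_cpt(factor, y):
    """Check the CPT constraint: for every setting of the other variables,
    the values of `y` sum to 1 (within floating-point tolerance)."""
    vars_, table = factor
    i = vars_.index(y)
    totals = {}
    for key, p in table.items():
        rest = key[:i] + key[i + 1:]          # assignment to the non-y variables
        totals[rest] = totals.get(rest, 0.0) + p
    return all(abs(t - 1.0) < 1e-9 for t in totals.values())

def normalize(factor):
    """Rescale a factor's values to sum to 1 (applied once, at the very end)."""
    vars_, table = factor
    total = sum(table.values())
    return vars_, {key: p / total for key, p in table.items()}

# P(Y | X) as a factor f(Y, X): for each value of X, the values of Y sum to 1.
f = (("Y", "X"), {(True, True): 0.9, (False, True): 0.1,
                  (True, False): 0.3, (False, False): 0.7})
print(is_cpt(f, "Y"))  # True
```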

  7. Conditioning
 • Conditioning is an operation on a single factor
 • Constructs a new factor that returns the values of the original factor with some of its inputs fixed
 Definition: For a factor f1(X1,...,Xk), conditioning on Xi = vi yields a new factor f2(X1,...,Xi−1, Xi+1,...,Xk) = (f1)Xi=vi such that for all values v1,...,vi−1, vi+1,...,vk in the domains of X1,...,Xi−1, Xi+1,...,Xk:
 f2(v1,...,vi−1, vi+1,...,vk) = f1(v1,...,vi−1, vi, vi+1,...,vk)

  8. Conditioning Example
 f2(A,B) = (f1(A,B,C))C=true

 f1(A,B,C):
 A B C | value
 F F F | 0.1
 F F T | 0.88
 F T F | 0.12
 F T T | 0.45
 T F F | 0.7
 T F T | 0.66
 T T F | 0.1
 T T T | 0.25

 f2(A,B):
 A B | value
 F F | 0.88
 F T | 0.45
 T F | 0.66
 T T | 0.25
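The conditioning operation can be sketched in a few lines, using a (vars, table) pair for a factor (`vars` a tuple of variable names, `table` a dict from assignment tuples to numbers). The representation is an illustrative assumption; the data is the table from this slide.

```python
def condition(factor, var, value):
    """Fix `var` to `value`, dropping it from the factor's scope."""
    vars_, table = factor
    i = vars_.index(var)
    new_vars = vars_[:i] + vars_[i + 1:]
    # Keep only rows where `var` has the observed value, then drop that column.
    new_table = {key[:i] + key[i + 1:]: p
                 for key, p in table.items() if key[i] == value}
    return new_vars, new_table

# f1(A,B,C) from the slide; conditioning on C = true keeps the C = T rows.
f1 = (("A", "B", "C"), {
    (False, False, False): 0.1,  (False, False, True): 0.88,
    (False, True,  False): 0.12, (False, True,  True): 0.45,
    (True,  False, False): 0.7,  (True,  False, True): 0.66,
    (True,  True,  False): 0.1,  (True,  True,  True): 0.25,
})
f2 = condition(f1, "C", True)  # matches the f2(A,B) table on the slide
```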

  9. Multiplication
 • Multiplication is an operation on two factors
 • Constructs a new factor that returns the product of the rows selected from each factor by its arguments
 Definition: For two factors f1(X1,...,Xj, Y1,...,Yk) and f2(Y1,...,Yk, Z1,...,Zℓ), multiplication of f1 and f2 yields a new factor (f1 ⨉ f2) = f3(X1,...,Xj, Y1,...,Yk, Z1,...,Zℓ) such that for all values x1,...,xj, y1,...,yk, z1,...,zℓ:
 f3(x1,...,xj, y1,...,yk, z1,...,zℓ) = f1(x1,...,xj, y1,...,yk) f2(y1,...,yk, z1,...,zℓ)

  10. Multiplication Example
 f3(A,B,C) = f1(A,B) ⨉ f2(B,C)

 f1(A,B):
 A B | value
 F F | 0.1
 F T | 0.2
 T F | 0.3
 T T | 0.4

 f2(B,C):
 B C | value
 F F | 1.0
 F T | 0
 T F | 0.5
 T T | 0.25

 f3(A,B,C):
 A B C | value
 F F F | 0.1
 F F T | 0
 F T F | 0.1
 F T T | 0.05
 T F F | 0.3
 T F T | 0
 T T F | 0.2
 T T T | 0.1
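Factor multiplication can be sketched as follows, again representing a factor as a (vars, table) pair (an illustrative assumption). Rows of the two factors are combined whenever they agree on the shared variables; the data is from this slide.

```python
def multiply(fa, fb):
    """Product of two factors: rows are paired when they agree on shared vars."""
    vars_a, ta = fa
    vars_b, tb = fb
    extra = tuple(v for v in vars_b if v not in vars_a)
    new_vars = vars_a + extra
    new_table = {}
    for ka, pa in ta.items():
        asg_a = dict(zip(vars_a, ka))
        for kb, pb in tb.items():
            asg_b = dict(zip(vars_b, kb))
            # Only combine rows that agree on the shared variables.
            if all(asg_a[v] == asg_b[v] for v in vars_b if v in asg_a):
                new_table[ka + tuple(asg_b[v] for v in extra)] = pa * pb
    return new_vars, new_table

f1 = (("A", "B"), {(False, False): 0.1, (False, True): 0.2,
                   (True, False): 0.3, (True, True): 0.4})
f2 = (("B", "C"), {(False, False): 1.0, (False, True): 0.0,
                   (True, False): 0.5, (True, True): 0.25})
f3 = multiply(f1, f2)  # f3(A,B,C), e.g. f3(F,T,T) = 0.2 * 0.25 = 0.05
```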

  11. Summing Out
 • Summing out is an operation on a single factor
 • Constructs a new factor that returns the sum over all values of a random variable of the original factor
 Definition: For a factor f1(X1,...,Xk), summing out a variable Xi yields a new factor f2(X1,...,Xi−1, Xi+1,...,Xk) = ∑Xi f1 such that for all values v1,...,vi−1, vi+1,...,vk in the domains of X1,...,Xi−1, Xi+1,...,Xk:
 f2(v1,...,vi−1, vi+1,...,vk) = ∑vi∈dom(Xi) f1(v1,...,vi−1, vi, vi+1,...,vk)

  12. Summing Out Example
 f2(B) = ∑A f1(A,B)

 f1(A,B):
 A B | value
 F F | 0.1
 F T | 0.2
 T F | 0.3
 T T | 0.4

 f2(B):
 B | value
 F | 0.4
 T | 0.6
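Summing out is the simplest of the three operations: drop the variable's column and accumulate the values of rows that become identical. A sketch using the same (vars, table) factor representation (an illustrative assumption), with this slide's data:

```python
def sum_out(factor, var):
    """Sum `var` out of a factor, accumulating rows that now coincide."""
    vars_, table = factor
    i = vars_.index(var)
    new_vars = vars_[:i] + vars_[i + 1:]
    new_table = {}
    for key, p in table.items():
        reduced = key[:i] + key[i + 1:]       # drop var's column
        new_table[reduced] = new_table.get(reduced, 0.0) + p
    return new_vars, new_table

f1 = (("A", "B"), {(False, False): 0.1, (False, True): 0.2,
                   (True, False): 0.3, (True, True): 0.4})
f2 = sum_out(f1, "A")  # f2(B): F -> 0.1 + 0.3 = 0.4, T -> 0.2 + 0.4 = 0.6
```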

  13. Variable Elimination
 • Given observations Y1 = v1, ..., Yk = vk and query variable Q, we want
 P(Q | Y1 = v1, ..., Yk = vk) = P(Q, Y1 = v1, ..., Yk = vk) / ∑q∈dom(Q) P(Q = q, Y1 = v1, ..., Yk = vk)
 • Basic idea of variable elimination:
 1. Condition on observations by conditioning
 2. Construct joint distribution factor by multiplication
 3. Remove unwanted variables (neither query nor observed) by summing out
 4. Normalize at the end
 • Doing these steps in order is correct but not efficient
 • Efficiency comes from interleaving the order of operations

  14. Sums of Products
 2. Construct joint distribution factor by multiplication
 3. Remove unwanted variables (neither query nor observed) by summing out
 The computationally intensive part of variable elimination is computing sums of products.
 Example: multiply factors f1(Q,A,B,C), f2(C,D,E); sum out A, E
 1. f3(Q,A,B,C,D,E) = f1(Q,A,B,C) ⨉ f2(C,D,E): 2^6 multiplications
 2. f4(Q,B,C,D) = ∑A,E f3(Q,A,B,C,D,E): ~2^3 additions
 Total: about 72 computations

  15. Efficient Sums of Products
 We can reduce the number of computations required by changing their order.
 ∑A ∑E f1(Q,A,B,C) ⨉ f2(C,D,E) = ( ∑A f1(Q,A,B,C) ) ⨉ ( ∑E f2(C,D,E) )
 1. f3(C,D) = ∑E f2(C,D,E): ~2^2 additions
 2. f4(Q,B,C) = ∑A f1(Q,A,B,C): ~2^3 additions
 3. f5(Q,B,C,D) = f4(Q,B,C) ⨉ f3(C,D): 2^4 multiplications
 Total: about 28 computations
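That the two orders give the same answer can be checked numerically: summing A and E out of the full product equals the product of the partial sums, since f1 does not mention E and f2 does not mention A. A small sketch with arbitrary random values for two binary factors (the factor names follow the slide):

```python
from itertools import product
import random

random.seed(0)
vals = [False, True]
f1 = {k: random.random() for k in product(vals, repeat=4)}  # f1(Q,A,B,C)
f2 = {k: random.random() for k in product(vals, repeat=3)}  # f2(C,D,E)

# Naive order: build the full product over (Q,A,B,C,D,E), then sum out A, E.
naive = {}
for (q, a, b, c, d, e) in product(vals, repeat=6):
    key = (q, b, c, d)
    naive[key] = naive.get(key, 0.0) + f1[(q, a, b, c)] * f2[(c, d, e)]

# Efficient order: (sum_A f1) x (sum_E f2), as on the slide.
f4 = {(q, b, c): sum(f1[(q, a, b, c)] for a in vals)
      for (q, b, c) in product(vals, repeat=3)}
f3 = {(c, d): sum(f2[(c, d, e)] for e in vals)
      for (c, d) in product(vals, repeat=2)}
efficient = {(q, b, c, d): f4[(q, b, c)] * f3[(c, d)]
             for (q, b, c, d) in product(vals, repeat=4)}

for key in naive:
    assert abs(naive[key] - efficient[key]) < 1e-9
```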

  16. Variable Elimination Algorithm
 Input: query variable Q; set of variables Vs; observations O; factors Ps representing conditional probability tables

 Fs := Ps
 for each X in Vs \ {Q} according to some elimination ordering:
     Rs := {F in Fs | F involves X}
     if X is observed:
         for each F in Rs:
             F' := F conditioned on the observed value of X
             Fs := Fs \ {F} ⋃ {F'}
     else:
         T := product of factors in Rs
         N := sum X out of T
         Fs := Fs \ Rs ⋃ {N}
 T := product of factors in Fs
 N := sum Q out of T
 return T / N
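The pseudocode above can be turned into a short runnable sketch. Factors are (vars, table) pairs (`vars` a tuple of names, `table` a dict from assignment tuples to numbers); the representation, helper names, and the tiny Rain/Wet network at the bottom are all made-up illustrations under those assumptions, not the course's reference implementation.

```python
from functools import reduce

def condition(f, var, val):
    vs, t = f
    i = vs.index(var)
    return vs[:i] + vs[i + 1:], {k[:i] + k[i + 1:]: p
                                 for k, p in t.items() if k[i] == val}

def multiply(fa, fb):
    vs_a, ta = fa
    vs_b, tb = fb
    extra = tuple(v for v in vs_b if v not in vs_a)
    out = {}
    for ka, pa in ta.items():
        a = dict(zip(vs_a, ka))
        for kb, pb in tb.items():
            b = dict(zip(vs_b, kb))
            if all(a[v] == b[v] for v in vs_b if v in a):
                out[ka + tuple(b[v] for v in extra)] = pa * pb
    return vs_a + extra, out

def sum_out(f, var):
    vs, t = f
    i = vs.index(var)
    out = {}
    for k, p in t.items():
        r = k[:i] + k[i + 1:]
        out[r] = out.get(r, 0.0) + p
    return vs[:i] + vs[i + 1:], out

def variable_elimination(query, variables, observations, factors):
    fs = list(factors)
    for x in variables:
        if x == query:
            continue
        rs = [f for f in fs if x in f[0]]       # factors involving x
        fs = [f for f in fs if x not in f[0]]
        if x in observations:                   # condition on observed value
            fs += [condition(f, x, observations[x]) for f in rs]
        elif rs:                                # multiply, then sum x out
            fs.append(sum_out(reduce(multiply, rs), x))
    vs, t = reduce(multiply, fs)                # product of remaining factors
    total = sum(t.values())                     # normalize at the end
    return vs, {k: p / total for k, p in t.items()}

# Hypothetical two-node network: Rain -> Wet, with made-up numbers.
p_rain = (("Rain",), {(True,): 0.2, (False,): 0.8})
p_wet = (("Wet", "Rain"), {(True, True): 0.9, (False, True): 0.1,
                           (True, False): 0.1, (False, False): 0.9})
post = variable_elimination("Rain", ["Wet", "Rain"], {"Wet": True},
                            [p_rain, p_wet])
print(post)  # P(Rain=true | Wet=true) = 0.18 / 0.26, about 0.692
```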

  17. Variable Elimination Example: Conditioning
 Query: P(Tampering | Smoke=true, Report=true)
 Variable ordering: Smoke, Report, Fire, Alarm, Leaving
 P(Tampering, Fire, Alarm, Smoke, Leaving, Report) =
 P(Tampering)P(Fire)P(Alarm|Tampering,Fire)P(Smoke|Fire)P(Leaving|Alarm)P(Report|Leaving)
 Construct factors for each table:
 { f0(Tampering), f1(Fire), f2(Tampering,Alarm,Fire), f3(Smoke,Fire), f4(Leaving,Alarm), f5(Report,Leaving) }
 Condition on Smoke: f6 = (f3)Smoke=true
 { f0(Tampering), f1(Fire), f2(Tampering,Alarm,Fire), f6(Fire), f4(Leaving,Alarm), f5(Report,Leaving) }
 Condition on Report: f7 = (f5)Report=true
 { f0(Tampering), f1(Fire), f2(Tampering,Alarm,Fire), f6(Fire), f4(Leaving,Alarm), f7(Leaving) }

  18. Variable Elimination Example: Elimination
 Query: P(Tampering | Smoke=true, Report=true)
 Variable ordering: Smoke, Report, Fire, Alarm, Leaving
 { f0(Tampering), f1(Fire), f2(Tampering,Alarm,Fire), f6(Fire), f4(Leaving,Alarm), f7(Leaving) }
 Sum out Fire from the product of f1, f2, f6: f8 = ∑Fire (f1 ⨉ f2 ⨉ f6)
 { f0(Tampering), f8(Tampering,Alarm), f4(Leaving,Alarm), f7(Leaving) }
 Sum out Alarm from the product of f8, f4: f9 = ∑Alarm (f8 ⨉ f4)
 { f0(Tampering), f9(Tampering,Leaving), f7(Leaving) }
 Sum out Leaving from the product of f9, f7: f10 = ∑Leaving (f9 ⨉ f7)
 { f0(Tampering), f10(Tampering) }

  19. Variable Elimination Example: Normalization
 Query: P(Tampering | Smoke=true, Report=true)
 Variable ordering: Smoke, Report, Fire, Alarm, Leaving
 { f0(Tampering), f10(Tampering) }
 Product of remaining factors: f11 = f0 ⨉ f10
 { f11(Tampering) }
 Normalize by division:
 query(Tampering) = f11(Tampering) / ( ∑Tampering f11(Tampering) )

  20. Optimizing Elimination Order
 • Variable elimination exploits efficient sums of products on a factored joint distribution
 • The elimination order of the variables affects the efficiency of the algorithm
 • Finding an optimal elimination ordering is NP-hard
 • Heuristics (rules of thumb) for good orderings:
 • Min-factor: At every stage, select the variable that constructs the smallest new factor
 • Problem-specific heuristics
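The min-factor heuristic can be sketched directly from its description: for each candidate variable, compute the size of the factor its elimination would create (the product of domain sizes over the union of the scopes it appears in, minus the variable itself), and pick the smallest. The function and variable names here are illustrative assumptions.

```python
def min_factor_var(scopes, domain_sizes, candidates):
    """Min-factor heuristic: pick the elimination variable whose
    elimination would construct the smallest new factor.
    `scopes` is a list of factor scopes (tuples of variable names);
    `domain_sizes` maps each variable to its domain size."""
    def new_factor_size(x):
        scope = set()
        for s in scopes:
            if x in s:
                scope |= set(s)   # variables of the product of x's factors
        scope.discard(x)          # x itself is summed out
        size = 1
        for v in scope:
            size *= domain_sizes[v]
        return size
    return min(candidates, key=new_factor_size)

# Hypothetical chain of factors A-B, B-C, C-D with D's domain larger.
scopes = [("A", "B"), ("B", "C"), ("C", "D")]
domain_sizes = {"A": 2, "B": 2, "C": 2, "D": 8}
choice = min_factor_var(scopes, domain_sizes, ["C", "B"])
print(choice)  # "B": eliminating B builds a factor over {A, C} (size 4),
               # while eliminating C builds one over {B, D} (size 16)
```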

  21. Optimization: Pruning
 • The structure of the graph can allow us to drop leaf nodes that are neither observed nor queried
 • Summing them out comes for free
 • We can repeat this process
 [Figure: extended belief network with nodes Tampering, Fire, Alarm, Smoke, Leaving, Report, Traffic, Restaurants Full]
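The repeated-pruning idea can be sketched as a fixpoint loop over the graph: keep removing any leaf (a node with no remaining children) that is neither the query nor observed. The helper and the query/observation choice below are hypothetical illustrations on the lecture's fire-alarm graph.

```python
def prune(parents, query, observed):
    """Repeatedly drop leaf nodes that are neither observed nor queried.
    `parents` maps each node to the list of its parents."""
    nodes = set(parents)
    keep = {query} | set(observed)
    changed = True
    while changed:
        changed = False
        for n in list(nodes):
            # n is a leaf if no remaining node lists it as a parent.
            has_child = any(n in parents[c] for c in nodes)
            if n not in keep and not has_child:
                nodes.remove(n)   # summing n out is free
                changed = True
    return nodes

parents = {
    "Tampering": [], "Fire": [],
    "Alarm": ["Tampering", "Fire"], "Smoke": ["Fire"],
    "Leaving": ["Alarm"], "Report": ["Leaving"],
}
# Query Tampering with Alarm observed: Report, then Leaving, then Smoke
# become prunable leaves, leaving only {Tampering, Fire, Alarm}.
remaining = prune(parents, "Tampering", {"Alarm"})
print(sorted(remaining))  # ['Alarm', 'Fire', 'Tampering']
```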
