

  1. Graphical Models: Clique Trees & Belief Propagation. Siamak Ravanbakhsh, Fall 2019

  2. Learning objectives: message passing on clique trees; its relation to variable elimination; two different forms of belief propagation.

  3. Recap: variable elimination (VE). Marginalize over a subset, e.g., $P(J) = \sum_{C,D,I,G,S,L,H} P(C,D,I,G,S,L,J,H)$, which is expensive to calculate (why?). Use the factorized form of $P$: $\sum_{C,D,I,G,S,L,H} P(D \mid C)\, P(G \mid D,I)\, P(S \mid I)\, P(L \mid G)\, P(J \mid L,S)\, P(H \mid G,J)$.

  4. Recap: variable elimination (VE). Marginalize over a subset, e.g., $P(J) = \sum_{C,D,I,G,S,L,H} P(J,H,C,D,I,G,S,L)$, which is expensive to calculate (why?). Use the factorized form of $P$: $\sum_{C,D,I,G,S,L,H} P(D \mid C)\, P(G \mid D,I)\, P(S \mid I)\, P(L \mid G)\, P(J \mid L,S)\, P(H \mid G,J)$. Think of each conditional as a factor/potential, e.g., $\phi_2(H,G,J)$ for $P(H \mid G,J)$: this gives the same treatment of Bayes nets and Markov nets for inference, though note that they do not encode the same CIs.

  5. Recap: variable elimination (VE). Marginalize over a subset, e.g., $P(J) = \sum_{C,D,I,G,S,L,H} P(J,H,C,D,I,G,S,L)$, which is expensive to calculate (why?). Use the factorized form: $\sum_{C,D,I,G,S,L,H} \phi_1(D,C)\, \phi_2(G,D,I)\, \phi_3(S,I)\, \phi_4(L,G)\, \phi_5(J,L,S)\, \phi_6(H,G,J) = \dots \sum_I \phi_3(S,I) \sum_D \phi_2(G,D,I) \sum_C \phi_1(D,C) \dots$. Repeat this: first $\psi_1'(D) = \sum_C \psi_1(D,C)$ with $\psi_1 = \phi_1$, giving $\dots \sum_I \phi_3(S,I) \sum_D \phi_2(G,D,I)\, \psi_1'(D) \dots$, where $\psi_2(G,I,D) = \phi_2(G,D,I)\, \psi_1'(D)$ and $\psi_2'(G,I) = \sum_D \psi_2(G,I,D)$.

  6. Recap: variable elimination (VE). Marginalize over a subset, e.g., $P(J) = \sum_{C,D,I,G,S,L,H} P(C,D,I,G,S,L,J,H)$, which is expensive to calculate (why?). Eliminate variables in some order (the order of factors in the summation), e.g., $C, D, I, \dots$
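As a concrete illustration of pushing sums inside the product, here is a minimal VE sketch in Python/numpy on a hypothetical three-variable chain A → B → C (the variables and numbers are illustrative, not the lecture's student-network example):

```python
import numpy as np

# Hypothetical chain A -> B -> C with binary variables.
pA = np.array([0.6, 0.4])                  # P(A)
pB_A = np.array([[0.7, 0.3], [0.2, 0.8]])  # P(B|A), rows indexed by A
pC_B = np.array([[0.9, 0.1], [0.5, 0.5]])  # P(C|B), rows indexed by B

# Naive approach: build the full joint P(A,B,C), then marginalize out A and B.
joint = pA[:, None, None] * pB_A[:, :, None] * pC_B[None, :, :]
pC_naive = joint.sum(axis=(0, 1))

# Variable elimination: push the sums inside the product.
psi_B = pA @ pB_A       # eliminate A: psi(B) = sum_A P(A) P(B|A)
pC_ve = psi_B @ pC_B    # eliminate B: P(C) = sum_B psi(B) P(C|B)

assert np.allclose(pC_naive, pC_ve)
```

The naive joint needs a table exponential in the number of variables, while each elimination step only ever touches factors mentioning the variable being summed out.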

  7. Recap: variable elimination (VE). Eliminating variables in some order creates a chordal graph; the maximal cliques of this graph are the factors $\psi_t$ created during VE. Example: $P(J)$? order: $C, D, I, H, G, S, L$.

  8. Clique-tree. Summarize the VE computation using a clique tree (order: $C, D, I, H, G, S, L$). Clusters are maximal cliques (the scopes of factors created during VE): $C_i = \mathrm{Scope}[\psi_i]$.

  9. Clique-tree. Summarize the VE computation using a clique tree (order: $C, D, I, H, G, S, L$). Clusters are maximal cliques (the factors that are marginalized): $C_i = \mathrm{Scope}[\psi_i]$. Sepsets are the result of marginalization over cliques: $S_{i,j} = \mathrm{Scope}[\psi_i'] = C_i \cap C_j$.

  10. Clique-tree: properties. A tree $T$ built from clusters $C_i$ and sepsets $S_{i,j} = C_i \cap C_j$. Family-preserving property: each factor $\phi$ is associated with a cluster $\alpha(\phi) = j$ such that $\mathrm{Scope}[\phi] \subseteq C_j$.

  11. Clique-tree: properties. A tree $T$ built from clusters $C_i$ and sepsets $S_{i,j} = C_i \cap C_j$. Family-preserving property: each factor $\phi$ is associated with a cluster $\alpha(\phi) = j$ such that $\mathrm{Scope}[\phi] \subseteq C_j$. Running intersection property: if $X \in C_i$ and $X \in C_j$, then $X \in C_k$ for every $C_k$ on the path $C_i \to \dots \to C_j$.
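The running intersection property can be checked mechanically. Below is a small Python sketch over a hypothetical clique tree whose clusters resemble the student-network example; the `path` and `satisfies_rip` helpers are illustrative, not from the lecture:

```python
# Clusters as frozensets of variable names, tree as an edge list.
clusters = {
    1: frozenset("CD"), 2: frozenset("DIG"), 3: frozenset("GIS"),
    4: frozenset("GJSL"), 5: frozenset("HGJ"),
}
edges = [(1, 2), (2, 3), (3, 4), (4, 5)]

def path(tree_edges, i, j):
    """Return the unique path between cliques i and j in a tree (DFS)."""
    adj = {}
    for a, b in tree_edges:
        adj.setdefault(a, []).append(b)
        adj.setdefault(b, []).append(a)
    stack = [(i, [i])]
    while stack:
        node, p = stack.pop()
        if node == j:
            return p
        for nxt in adj[node]:
            if nxt not in p:
                stack.append((nxt, p + [nxt]))

def satisfies_rip(clusters, tree_edges):
    """Running intersection: C_i ∩ C_j must appear in every clique on the i-j path."""
    ids = list(clusters)
    for i in ids:
        for j in ids:
            shared = clusters[i] & clusters[j]
            for k in path(tree_edges, i, j):
                if not shared <= clusters[k]:
                    return False
    return True

print(satisfies_rip(clusters, edges))  # True for this tree
```

Rewiring the tree so that the shared variables of two cliques miss an intermediate clique (e.g., routing {G,I,S}–{G,J,S,L} through {H,G,J}, which lacks S) makes the check fail.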

  12. VE as message passing. Think of VE as sending messages.

  13. VE as message passing. Think of VE as sending messages. Calculate the product of factors in each clique: $\psi_i(C_i) \triangleq \prod_{\phi : \alpha(\phi) = i} \phi$. Send messages from the leaves towards a root: $\delta_{i \to j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i - j} \delta_{k \to i}(S_{i,k})$, where $\mathrm{Nb}_i$ denotes the neighbours of clique $i$.

  14. Message passing. Think of VE as sending messages from the leaves towards a root: $\delta_{i \to j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i - j} \delta_{k \to i}(S_{i,k}) = \sum_{V_{\prec(i \to j)}} \prod_{\phi \in F_{\prec(i \to j)}} \phi$, where $V_{\prec(i \to j)}$ is the set of all variables on the $i$ side of the tree and $F_{\prec(i \to j)}$ is the set of all factors on the $i$ side of the tree: the message is the marginal from one side of the tree.

  15. Message passing. Think of VE as sending messages from the leaves towards a root: $\delta_{i \to j}(S_{i,j}) \triangleq \sum_{C_i - S_{i,j}} \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i - j} \delta_{k \to i}(S_{i,k})$. The belief at the root clique is $\beta_r(C_r) \triangleq \psi_r(C_r) \prod_{k \in \mathrm{Nb}_r} \delta_{k \to r}(S_{r,k})$, which is proportional to the marginal: $\beta_r(C_r) \propto \sum_{X - C_r} P(X)$.

  16. Message passing: root-to-leaves. What if we continue sending messages (from the root to the leaves)? Clique $i$ sends a message to clique $j$ once it has received messages from all its other neighbours $k$.

  17. Message passing: root-to-leaves. What if we continue sending messages (from the root to the leaves)? This gives sum-product belief propagation (BP): $\delta_{i \to j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i - j} \delta_{k \to i}(S_{i,k})$, with sepset beliefs $\mu_{i,j}(S_{i,j}) \triangleq \delta_{i \to j}(S_{i,j})\, \delta_{j \to i}(S_{i,j})$, and marginals $\beta_i(C_i) \triangleq \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i} \delta_{k \to i}(S_{i,k})$ for any clique $i$ (not only the root).
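A minimal numpy sketch of sum-product BP on a two-clique tree (factor values are illustrative). It checks that the sepset belief $\mu$ agrees with both clique beliefs, and that $\beta_1$ is proportional to the true marginal of the unnormalized distribution:

```python
import numpy as np

# Two cliques C1 = {A,B}, C2 = {B,C}, sepset S = {B}; binary variables.
psi1 = np.array([[3., 1.], [2., 4.]])   # psi_1(A,B), axes (A,B)
psi2 = np.array([[5., 1.], [1., 5.]])   # psi_2(B,C), axes (B,C)

# One message each way (leaves-to-root, then root-to-leaves).
d12 = psi1.sum(axis=0)                  # delta_{1->2}(B) = sum_A psi_1
d21 = psi2.sum(axis=1)                  # delta_{2->1}(B) = sum_C psi_2

mu = d12 * d21                          # mu_{1,2}(B) = delta_{1->2} delta_{2->1}
beta1 = psi1 * d21[None, :]             # beta_1(A,B) = psi_1 * delta_{2->1}
beta2 = psi2 * d12[:, None]             # beta_2(B,C) = psi_2 * delta_{1->2}

# Calibration: both cliques agree on the sepset marginal.
assert np.allclose(beta1.sum(axis=0), mu)
assert np.allclose(beta2.sum(axis=1), mu)

# Beliefs are proportional to the marginals of P~ = psi_1 * psi_2.
joint = psi1[:, :, None] * psi2[None, :, :]   # unnormalized P(A,B,C)
assert np.allclose(beta1 / beta1.sum(), joint.sum(axis=2) / joint.sum())
```

On a larger tree the same two sweeps (leaves-to-root, root-to-leaves) yield calibrated beliefs at every clique.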

  18. Summary so far... VE creates a chordal induced graph; the maximal cliques in this graph are the clusters. Message passing view of VE: send messages between clusters towards a root. Going beyond VE: sending messages back from the root produces marginals over all clusters.

  19. Clique-tree: calibration. Represent $P$ using marginals: $\tilde{P} = \prod_i \psi_i = \frac{\prod_i \psi_i \prod_{k \in \mathrm{Nb}_i} \delta_{k \to i}}{\prod_{(i,j) \in E} \delta_{i \to j}\, \delta_{j \to i}} = \frac{\prod_i \beta_i}{\prod_{(i,j) \in E} \mu_{i,j}}$.

  20. Clique-tree: calibration. Represent $P$ using marginals: $\tilde{P} = \prod_i \psi_i = \frac{\prod_i \beta_i}{\prod_{(i,j) \in E} \mu_{i,j}}$. An arbitrary assignment to all $\beta_i, \mu_{i,j}$ is calibrated iff $\mu_{i,j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \beta_i(C_i) = \sum_{C_j - S_{i,j}} \beta_j(C_j)$. BP produces calibrated beliefs.

  21. Clique-tree: calibration. Represent $P$ using marginals: $\tilde{P} = \prod_i \psi_i = \frac{\prod_i \beta_i}{\prod_{(i,j) \in E} \mu_{i,j}}$. An arbitrary assignment to all $\beta_i, \mu_{i,j}$ is calibrated iff $\mu_{i,j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \beta_i(C_i) = \sum_{C_j - S_{i,j}} \beta_j(C_j)$; BP produces calibrated beliefs. Being calibrated and satisfying $\tilde{P}(X) \propto \frac{\prod_i \beta_i(C_i)}{\prod_{(i,j) \in E} \mu_{i,j}(S_{i,j})}$ means that all $\beta_i, \mu_{i,j}$ are marginals: $\beta_i(C_i) \propto P(C_i)$.
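The reparameterization identity $\tilde{P} = \prod_i \beta_i / \prod_{(i,j) \in E} \mu_{i,j}$ can be verified numerically on a toy two-clique tree (all factor values below are illustrative):

```python
import numpy as np

# Two cliques C1 = {A,B}, C2 = {B,C}, sepset {B}; binary variables.
psi1 = np.array([[3., 1.], [2., 4.]])   # psi_1(A,B)
psi2 = np.array([[5., 1.], [1., 5.]])   # psi_2(B,C)

# One round of sum-product BP (one message each way).
d12 = psi1.sum(axis=0)
d21 = psi2.sum(axis=1)
mu = d12 * d21                          # sepset belief mu(B)
beta1 = psi1 * d21[None, :]             # clique belief beta_1(A,B)
beta2 = psi2 * d12[:, None]             # clique belief beta_2(B,C)

# Unnormalized distribution P~(A,B,C) = psi_1(A,B) psi_2(B,C).
Ptilde = psi1[:, :, None] * psi2[None, :, :]
# Product of clique beliefs divided by product of sepset beliefs.
recon = beta1[:, :, None] * beta2[None, :, :] / mu[None, :, None]
assert np.allclose(recon, Ptilde)
```

Each message appears once in a clique belief's numerator and once in the shared sepset belief, so dividing cancels them exactly and recovers $\tilde{P}$.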

  22. BP: an alternative update. Approach 1 (message update): $\delta_{i \to j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i - j} \delta_{k \to i}(S_{i,k})$, then calculate the beliefs at the end: $\beta_i(C_i) = \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i} \delta_{k \to i}(S_{i,k})$. Alternative idea: update the beliefs directly so that they are calibrated and they satisfy the representation of $\tilde{P}$.

  23. BP: an alternative update. Approach 1 (message update): $\delta_{i \to j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i - j} \delta_{k \to i}(S_{i,k})$, then calculate the beliefs at the end: $\beta_i(C_i) = \psi_i(C_i) \prod_{k \in \mathrm{Nb}_i} \delta_{k \to i}(S_{i,k})$. Approach 2 (belief update) idea: update the beliefs so that they are calibrated, $\mu_{i,j}(S_{i,j}) = \sum_{C_i - S_{i,j}} \beta_i(C_i) = \sum_{C_j - S_{i,j}} \beta_j(C_j)$, and they satisfy $\prod_i \psi_i = \frac{\prod_i \beta_i}{\prod_{(i,j) \in E} \mu_{i,j}}$.

  24. BP: an alternative update. Belief update: initialize $\beta_i \leftarrow \psi_i = \prod_{\phi : \alpha(\phi) = i} \phi$ and $\mu_{i,j} \leftarrow 1$. Until convergence, pick some $(i,j) \in E$ and set: $\hat{\mu}_{i,j} \leftarrow \sum_{C_i - S_{i,j}} \beta_i$ (here $\hat{\mu}_{i,j} = \delta_{i \to j}^{new}\, \delta_{j \to i}$); $\beta_j \leftarrow \beta_j \frac{\hat{\mu}_{i,j}}{\mu_{i,j}}$ (since $\frac{\hat{\mu}_{i,j}}{\mu_{i,j}} = \frac{\delta_{i \to j}^{new}\, \delta_{j \to i}}{\delta_{i \to j}^{old}\, \delta_{j \to i}} = \frac{\delta_{i \to j}^{new}}{\delta_{i \to j}^{old}}$); $\mu_{i,j} \leftarrow \hat{\mu}_{i,j}$. At convergence, the beliefs are calibrated, $\sum_{C_i - S_{i,j}} \beta_i(C_i) = \sum_{C_j - S_{i,j}} \beta_j(C_j)$, and since each update preserves $\tilde{P} \propto \frac{\prod_i \beta_i}{\prod_{(i,j) \in E} \mu_{i,j}}$, the beliefs are marginals.
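The belief-update procedure above can be sketched in numpy on the same kind of hypothetical two-clique tree (values and the fixed iteration count are illustrative; a real implementation would test convergence):

```python
import numpy as np

# Two cliques C1 = {A,B}, C2 = {B,C}, sepset {B}; binary variables.
psi1 = np.array([[3., 1.], [2., 4.]])   # psi_1(A,B)
psi2 = np.array([[5., 1.], [1., 5.]])   # psi_2(B,C)

# Initialize: beta_i <- psi_i, mu <- 1.
beta1, beta2 = psi1.copy(), psi2.copy()
mu = np.ones(2)                          # mu_{1,2}(B)

for _ in range(10):                      # "until convergence"
    # Update over edge (1,2): absorb clique 1's view into clique 2.
    mu_hat = beta1.sum(axis=0)           # sum over C1 - S = {A}
    beta2 = beta2 * (mu_hat / mu)[:, None]
    mu = mu_hat
    # Update over edge (2,1): absorb clique 2's view into clique 1.
    mu_hat = beta2.sum(axis=1)           # sum over C2 - S = {C}
    beta1 = beta1 * (mu_hat / mu)[None, :]
    mu = mu_hat

# Calibrated: both cliques give the same sepset marginal.
assert np.allclose(beta1.sum(axis=0), beta2.sum(axis=1))
```

After the first full sweep the updates become no-ops (the ratio $\hat{\mu}/\mu$ is 1), which is exactly the calibration fixed point.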

  25. Clique-tree & queries. What type of queries can we answer? Marginals over subsets of cliques: $P(A)$ for $A \subseteq C_i$.

  26. Clique-tree & queries. What type of queries can we answer? Marginals over subsets of cliques: $P(A)$ for $A \subseteq C_i$. Updating the beliefs after new evidence: $P(A \mid E = e)$ for $A \subseteq C_i$, $E \subseteq C_j$. Multiply the (previously calibrated) belief by an indicator, $\beta_j(C_j)\, \mathbb{I}(E = e)$, then propagate to recalibrate (belief update procedure).
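Conditioning on evidence in an already-calibrated tree can be sketched as follows, continuing the hypothetical two-clique example (the calibrated beliefs below are illustrative values, with $\beta_1(A,B) \propto P(A,B)$ and $\beta_2(B,C) \propto P(B,C)$):

```python
import numpy as np

# Calibrated beliefs on C1 = {A,B}, C2 = {B,C}, sepset {B}.
beta1 = np.array([[18., 6.], [12., 24.]])   # beta_1(A,B)
beta2 = np.array([[25., 5.], [5., 25.]])    # beta_2(B,C)
mu = np.array([30., 30.])                   # mu(B)

# Observe evidence C = 0: multiply beta_2 by the indicator I(C = 0).
beta2 = beta2 * np.array([1., 0.])[None, :]

# Recalibrate with one belief-update step from clique 2 to clique 1.
mu_hat = beta2.sum(axis=1)                  # new sepset marginal over B
beta1 = beta1 * (mu_hat / mu)[None, :]
mu = mu_hat

# beta_1 is now proportional to P(A, B | C = 0); normalize for the query.
pA_given_C0 = beta1.sum(axis=1) / beta1.sum()
```

In a larger tree the evidence indicator is multiplied into one clique containing $E$, and the belief-update steps propagate it to every other clique along the tree.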
