

  1. Feedback Message Passing for Inference in Gaussian Graphical Models
Ying Liu, Venkat Chandrasekaran, Animashree Anandkumar, and Alan Willsky
Stochastic Systems Group, Laboratory for Information and Decision Systems, Massachusetts Institute of Technology
ISIT, Austin, Texas, June 18, 2010
1/21 Ying Liu (LIDS, MIT) Feedback Message Passing ISIT 2010

  2. Gaussian Graphical Models
The probability density of a Gaussian graphical model can be written as
    p(x) ∝ exp{ −(1/2) xᵀ J x + hᵀ x },
where J is called the information matrix and h is called the potential vector. For a valid model, J is symmetric and positive definite. An information matrix J is sparse, or Markov, with respect to a graph G = (V, E) if J_ij = 0 for all (i, j) ∉ E.
[Figure: matrix structure vs. graph structure]
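The correspondence between sparsity in J and graph structure can be illustrated with a minimal numpy sketch; the 4-node cycle graph and the edge weight 0.3 below are illustrative choices, not values from the talk.

```python
import numpy as np

# A small 4-node cycle graph: edges (0,1), (1,2), (2,3), (3,0).
# Build an information matrix J that is Markov with respect to this graph:
# J[i, j] is nonzero only when i == j or {i, j} is an edge.
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4
J = np.eye(n)                  # diagonal entries (node potentials)
for i, j in edges:
    J[i, j] = J[j, i] = 0.3    # edge potentials (illustrative values)

# Validity check: J must be symmetric and positive definite.
assert np.allclose(J, J.T)
assert np.all(np.linalg.eigvalsh(J) > 0)

# Markov property: the non-edges (0,2) and (1,3) have zero entries.
assert J[0, 2] == 0 and J[1, 3] == 0
```

Conversely, reading off the nonzero off-diagonal pattern of any valid J recovers the edge set of the graph it is Markov with respect to.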

  3. Inference Problem and Applications
Given p(x) ∝ exp{ −(1/2) xᵀ J x + hᵀ x }, compute the means µ = J⁻¹ h and the variances diag{Σ}, where Σ = J⁻¹. Solving this problem in general takes O(n³) time (fastest known: O(n^2.376)), which is intractable for very large-scale models.
Applications: gene regulatory networks, medical diagnostics, oceanography, and communication systems.
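The inference problem as stated is just linear algebra; a minimal numpy sketch (with a hypothetical 4-node model whose values are chosen only for illustration) makes the two quantities concrete.

```python
import numpy as np

# Hypothetical 4-node model (values chosen only for illustration).
J = np.array([[1.0, 0.3, 0.0, 0.3],
              [0.3, 1.0, 0.3, 0.0],
              [0.0, 0.3, 1.0, 0.3],
              [0.3, 0.0, 0.3, 1.0]])
h = np.array([1.0, 0.0, -1.0, 0.5])

# Means: solve J mu = h rather than forming J^{-1} explicitly.
mu = np.linalg.solve(J, h)

# Variances: the diagonal of Sigma = J^{-1}.
variances = np.diag(np.linalg.inv(J))

# Sanity checks against the definitions mu = J^{-1} h, Sigma = J^{-1}.
assert np.allclose(J @ mu, h)
assert np.all(variances > 0)
```

The dense solve here is the O(n³) baseline the slide refers to; the point of message-passing methods is to exploit the sparsity of J instead.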

  4. Related Work
Belief propagation on trees: linear time complexity, exactness.
Loopy belief propagation (LBP) for graphs with cycles:
  ◮ LBP performs reasonably well for certain loopy graphs (Murphy et al., Crick et al.).
  ◮ Convergence and accuracy are not guaranteed in general (Ihler et al., Weiss et al.).
  ◮ For Gaussian graphical models, if LBP converges, the means are correct but the variances are generally incorrect (Weiss et al.).
  ◮ Walk-sum analysis framework (Malioutov et al.).
Generalized BP (Yedidia et al.), embedded trees (Sudderth et al.), inference by tractable subgraphs (Chandrasekaran et al.).

  5. Main Results
Exact feedback message passing. Exact solution in O(k²n), where k is the size of the set of "feedback nodes" and n is the number of nodes.
Approximate feedback message passing. Approximate solution: trade off complexity against accuracy by selecting a proper set of "feedback nodes" of bounded size. Walk-sum interpretation.
High-level idea: run common BP/LBP on the non-feedback nodes, and a special message-passing scheme for the feedback nodes.
  1. Obtain inference results for the feedback nodes first.
  2. Make corrections for the non-feedback nodes afterward.
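The linear-algebra identity behind step 1 can be sketched as follows. This is not the FMP message-passing algorithm itself, just the block-elimination (Schur complement) view of it, on the same illustrative 4-cycle as before: conditioning on a feedback vertex set F whose removal leaves a tree makes the remaining subproblem tree-structured.

```python
import numpy as np

# Illustrative 4-cycle; removing node 0 leaves the path 1-2-3 (a tree),
# so F = {0} is a feedback vertex set with k = 1.
J = np.array([[1.0, 0.3, 0.0, 0.3],
              [0.3, 1.0, 0.3, 0.0],
              [0.0, 0.3, 1.0, 0.3],
              [0.3, 0.0, 0.3, 1.0]])
F, T = [0], [1, 2, 3]
J_FF = J[np.ix_(F, F)]
J_FT = J[np.ix_(F, T)]
J_TT = J[np.ix_(T, T)]

# Exact covariance of the feedback nodes via the Schur complement.
# In feedback message passing, the solves against the tree-structured
# block J_TT are carried out by BP on the tree rather than by a dense
# solver, which is what makes the overall cost scale with k and n.
Sigma_FF = np.linalg.inv(J_FF - J_FT @ np.linalg.solve(J_TT, J_FT.T))

# Agrees with the corresponding block of the full inverse.
assert np.allclose(Sigma_FF, np.linalg.inv(J)[np.ix_(F, F)])
```

Step 2 (corrections for the non-feedback nodes) then propagates the exact feedback-node results back through the tree, which dense block elimination does not capture in this sketch.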

  6. Gaussian Belief Propagation
1. Message passing: for all j ∈ N(i), compute the messages ΔJ_{i→j} and Δh_{i→j}.
2. Marginal computation: for all i ∈ V,
    Ĵ_i = J_ii + Σ_{k ∈ N(i)} ΔJ_{k→i},    ĥ_i = h_i + Σ_{k ∈ N(i)} Δh_{k→i},
    µ_i = Ĵ_i⁻¹ ĥ_i,    Var{x_i} = Ĵ_i⁻¹.
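The two steps above can be sketched in numpy for scalar variables on a small tree (a 4-node chain; the model values are illustrative, and the message updates are the standard scalar Gaussian BP recursions, which the slide leaves implicit).

```python
import numpy as np

# Illustrative 4-node chain model (a tree, so Gaussian BP is exact).
J = np.array([[1.0, 0.4, 0.0, 0.0],
              [0.4, 1.0, 0.4, 0.0],
              [0.0, 0.4, 1.0, 0.4],
              [0.0, 0.0, 0.4, 1.0]])
h = np.array([1.0, 0.0, -1.0, 0.5])
n = len(h)
nbrs = [[j for j in range(n) if j != i and J[i, j] != 0] for i in range(n)]

dJ = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}  # Delta J_{i->j}
dh = {(i, j): 0.0 for i in range(n) for j in nbrs[i]}  # Delta h_{i->j}

# Step 1: message passing.  On a tree, a few full sweeps reach the
# exact fixed point; each message excludes the recipient's contribution.
for _ in range(n):
    for i in range(n):
        for j in nbrs[i]:
            Jhat = J[i, i] + sum(dJ[k, i] for k in nbrs[i] if k != j)
            hhat = h[i] + sum(dh[k, i] for k in nbrs[i] if k != j)
            dJ[i, j] = -J[j, i] * J[i, j] / Jhat
            dh[i, j] = -J[j, i] * hhat / Jhat

# Step 2: marginal computation, exactly as on the slide.
mu, var = np.empty(n), np.empty(n)
for i in range(n):
    Jhat_i = J[i, i] + sum(dJ[k, i] for k in nbrs[i])
    hhat_i = h[i] + sum(dh[k, i] for k in nbrs[i])
    mu[i] = hhat_i / Jhat_i      # mu_i = Jhat_i^{-1} hhat_i
    var[i] = 1.0 / Jhat_i        # Var{x_i} = Jhat_i^{-1}

# On a tree, Gaussian BP recovers the exact means and variances.
assert np.allclose(mu, np.linalg.solve(J, h))
assert np.allclose(var, np.diag(np.linalg.inv(J)))
```

Running the same updates on a graph with cycles gives loopy BP, whose behavior the next slide discusses.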

  7. Loopy Belief Propagation
Message update scheme: completely local, with no header information, and it suffers from cyclic effects.
More memory and multiple messages?
Sacrifice some distributedness for better convergence and accuracy?
