
MAP Estimation with Perfect Graphs. Tony Jebara. July 21, 2009.



  1. MAP Estimation with Perfect Graphs
  Tony Jebara, July 21, 2009

  2. Outline
  1. Background: Perfect Graphs, Graphical Models
  2. Matchings: Bipartite Matching, Generalized Matching
  3. Perfect Graphs: NAND Markov Random Fields, Packing Linear Programs, Recognizing Perfect Graphs
  4. MAP Estimation: Proving Exact MAP, Pruning NMRFs, MAP Experiments, Conclusions

  3. Background on Perfect Graphs
  - In 1960, Claude Berge introduces perfect graphs and two conjectures.
  - Perfect iff every induced subgraph has clique number = chromatic number.
  - Weak conjecture: G is perfect iff its complement is perfect.
  - Strong conjecture: G is perfect iff it is a Berge graph.
  - Weak perfect graph theorem (Lovász 1972).
  - Link between perfection and integral LPs (Lovász 1972).
  - Strong perfect graph theorem (SPGT) open for 4+ decades.

  4. Background on Perfect Graphs
  - SPGT proof (Chudnovsky, Robertson, Seymour, Thomas 2003).
  - Berge passes away shortly after hearing of the proof.
  - Many NP-hard and hard-to-approximate problems are in P for perfect graphs: graph coloring, maximum clique, maximum independent set.
  - Recognizing perfect graphs is O(n^9) (Chudnovsky et al. 2006).

  5. Graphical Models
  [Figure: graphical model over x_1, ..., x_6]
  - Perfect graph theory for MAP and graphical models (J 2009).
  - Graphical model: a graph G = (V, E) representing a distribution p(X), where X = {x_1, ..., x_n} and x_i ∈ Z.
  - The distribution factorizes over graph cliques:
      p(x_1, ..., x_n) = (1/Z) ∏_{c ∈ C} ψ_c(X_c)
  - E.g. p(x_1, ..., x_6) = (1/Z) ψ(x_1, x_2) ψ(x_2, x_3) ψ(x_3, x_4, x_5) ψ(x_4, x_5, x_6)

  6. Canonical Problems for Graphical Models
  Given a factorized distribution
      p(x_1, ..., x_n) = (1/Z) ∏_{c ∈ C} ψ_c(X_c)
  - Inference: recover various marginals like p(x_i) or p(x_i, x_j):
      p(x_i) = ∑_{x_1} ... ∑_{x_{i-1}} ∑_{x_{i+1}} ... ∑_{x_n} p(x_1, ..., x_n)
  - Estimation: find most likely configurations x_i* or (x_1*, ..., x_n*):
      x_i* = arg max_{x_i} max_{x_1} ... max_{x_{i-1}} max_{x_{i+1}} ... max_{x_n} p(x_1, ..., x_n)
  - In general both are NP-hard, but for chains and trees they are in P.
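
Both canonical problems can be checked by brute force on a tiny model. The sketch below is not from the slides: the three-variable chain and its potential tables are made up for illustration. It enumerates all configurations to obtain a marginal and the MAP assignment, i.e. exactly the exponential-cost baseline that the faster methods on the following slides avoid.

    import itertools
    import numpy as np

    # Toy chain p(x1,x2,x3) ∝ psi12(x1,x2) * psi23(x2,x3), binary variables.
    # Potential tables are arbitrary illustrative numbers.
    psi12 = np.array([[1.0, 2.0],
                      [0.5, 1.0]])
    psi23 = np.array([[3.0, 1.0],
                      [1.0, 2.0]])

    def unnormalized(x1, x2, x3):
        return psi12[x1, x2] * psi23[x2, x3]

    states = list(itertools.product([0, 1], repeat=3))
    Z = sum(unnormalized(*s) for s in states)

    # Inference: marginal p(x2) by summing out x1 and x3.
    p_x2 = [sum(unnormalized(x1, x2, x3) for x1 in (0, 1) for x3 in (0, 1)) / Z
            for x2 in (0, 1)]

    # Estimation: MAP assignment by exhaustive maximization.
    x_map = max(states, key=lambda s: unnormalized(*s))

    print("p(x2) =", p_x2)
    print("MAP assignment (x1,x2,x3) =", x_map)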

  7. Example for Chain
  Given a chain-factorized distribution
      p(x_1, ..., x_5) = (1/Z) ψ(x_1, x_2) ψ(x_2, x_3) ψ(x_3, x_4) ψ(x_4, x_5)
  - Inference: recover various marginals like p(x_i) or p(x_i, x_j):
      p(x_5) ∝ ∑_{x_1} ∑_{x_2} ∑_{x_3} ∑_{x_4} ψ(x_1, x_2) ψ(x_2, x_3) ψ(x_3, x_4) ψ(x_4, x_5)
  - Estimation: find most likely configurations x_i* or (x_1*, ..., x_n*):
      x_5* = arg max_{x_5} max_{x_1} max_{x_2} ψ(x_1, x_2) max_{x_3} ψ(x_2, x_3) max_{x_4} ψ(x_3, x_4) ψ(x_4, x_5)
  - The work distributes and becomes efficient!
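
A minimal sketch of how the max distributes along a chain, mirroring the nested form of x_5* above; the five-variable binary chain and its random potentials are hypothetical. A single sweep of max-messages toward x_5 recovers the same maximizer as brute force.

    import itertools
    import numpy as np

    rng = np.random.default_rng(0)
    # Chain p(x1..x5) ∝ psi(x1,x2) psi(x2,x3) psi(x3,x4) psi(x4,x5), binary states.
    psis = [rng.random((2, 2)) + 0.1 for _ in range(4)]   # psi_k couples x_k and x_{k+1}

    # Distributed computation: sweep max-messages from x1 toward x5.
    m = np.ones(2)                      # message into x1 (nothing upstream)
    for psi in psis:
        # new_m[x_next] = max_{x_prev} m[x_prev] * psi[x_prev, x_next]
        m = (m[:, None] * psi).max(axis=0)
    x5_star = int(np.argmax(m))

    # Brute-force check: the same maximizer for x5.
    def joint(xs):
        return np.prod([psis[k][xs[k], xs[k + 1]] for k in range(4)])

    best = max(itertools.product([0, 1], repeat=5), key=joint)
    assert best[4] == x5_star
    print("x5* =", x5_star)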

  8-9. Canonical Problems on Trees
  - The idea of distributed computation extends nicely to trees.
  - On trees (which subsume chains), do a collect/distribute step.
  - Alternatively, the distributed updates can be performed asynchronously.
  - Each step is a sum-product or a max-product update.

  10. MAP Estimation
  Let's focus on finding most probable configurations efficiently:
      X* = arg max_{x_1, ..., x_n} p(x_1, ..., x_n)
  - Useful for image processing, protein folding, coding, etc.
  - Brute force requires ∏_{i=1}^n |x_i| evaluations.
  - Efficient for trees and singly linked graphs (Pearl 1988).
  - NP-hard for general graphs (Shimony 1994).
  Approach A: relaxations and variational methods
  - First-order LP relaxations (Wainwright et al. 2002)
  - TRW max-product (Kolmogorov & Wainwright 2006)
  - Higher-order LP relaxations (Sontag et al. 2008)
  - Fractional and integral LP rounding (Ravikumar et al. 2008)
  - Open problem: when are LPs tight?
  Approach B: loopy max-product and message passing

  11. Max Product Message Passing
  1. For each x_i to each X_c: m_{i→c}^{t+1} = ∏_{d ∈ Ne(i)\c} m_{d→i}^t
  2. For each X_c to each x_i: m_{c→i}^{t+1} = max_{X_c \ x_i} ψ_c(X_c) ∏_{j ∈ c\i} m_{j→c}^t
  3. Set t = t + 1 and go to 1 until convergence
  4. Output x_i* = arg max_{x_i} ∏_{d ∈ Ne(i)} m_{d→i}^t
  - Simple and fast algorithm for MAP
  - Exact for trees (Pearl 1988)
  - Exact for single-loop graphs (Weiss & Freeman 2001)
  - Local optimality guarantees (Wainwright et al. 2003)
  - Performs well in practice for images, turbo codes, etc.
  - Similar to first-order LP relaxation
  Recent progress:
  - Exact for matchings (Bayati et al. 2005)
  - Exact for generalized b-matchings (Huang and J 2007)
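
Here is a minimal synchronous implementation of steps 1-4 for pairwise factors. The 4-cycle graph and its random potential tables are toy assumptions, and the messages are normalized only for numerical stability (this does not change the argmax). It is a sketch of generic loopy max product, not the b-matching code referenced later.

    import numpy as np

    # Toy pairwise model on a loopy 4-cycle: variables 0..3 with binary states;
    # each edge (i, j) is a factor with a made-up potential table psi[(i, j)][x_i, x_j].
    edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
    rng = np.random.default_rng(1)
    psi = {e: rng.random((2, 2)) + 0.1 for e in edges}

    def factors_of(i):
        return [e for e in edges if i in e]

    # Messages m_vf[(i, e)]: variable i -> factor e; m_fv[(e, i)]: factor e -> variable i.
    m_vf = {(i, e): np.ones(2) for e in edges for i in e}
    m_fv = {(e, i): np.ones(2) for e in edges for i in e}

    for t in range(50):
        # Step 1: variable-to-factor = product of incoming factor messages, excluding the target factor.
        new_vf = {}
        for e in edges:
            for i in e:
                msg = np.ones(2)
                for d in factors_of(i):
                    if d != e:
                        msg = msg * m_fv[(d, i)]
                new_vf[(i, e)] = msg / msg.sum()
        # Step 2: factor-to-variable = max over the other variable of psi times its time-t message.
        new_fv = {}
        for (a, b) in edges:
            to_a = (psi[(a, b)] * m_vf[(b, (a, b))][None, :]).max(axis=1)
            to_b = (psi[(a, b)] * m_vf[(a, (a, b))][:, None]).max(axis=0)
            new_fv[((a, b), a)] = to_a / to_a.sum()
            new_fv[((a, b), b)] = to_b / to_b.sum()
        m_vf, m_fv = new_vf, new_fv            # step 3: advance t

    # Step 4: decode each variable from the product of its incoming factor messages.
    x_star = []
    for i in range(4):
        belief = np.ones(2)
        for d in factors_of(i):
            belief = belief * m_fv[(d, i)]
        x_star.append(int(np.argmax(belief)))
    print("max-product configuration:", x_star)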

  12. Bipartite Matching
  Weight matrix W and the optimal assignment C:
               Motorola  Apple  IBM
    "laptop"      $0      $2    $2               0 1 0
    "server"      $0      $2    $3      →  C =   0 0 1
    "phone"       $2      $3    $0               1 0 0
  - Given W, max_{C ∈ B^{n×n}} ∑_ij W_ij C_ij such that ∑_i C_ij = ∑_j C_ij = 1
  - Classical Hungarian marriage problem: O(n^3)
  - Creates a very loopy graphical model
  - Max product takes O(n^3) for exact MAP (Bayati et al. 2005)
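
For the 3x3 example above, the assignment can also be computed with SciPy's Hungarian-style solver; this is just an off-the-shelf check of the same objective, not the max-product approach the slide is about.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    # Profit matrix from the slide: rows = {laptop, server, phone}, cols = {Motorola, Apple, IBM}.
    W = np.array([[0, 2, 2],
                  [0, 2, 3],
                  [2, 3, 0]])

    rows, cols = linear_sum_assignment(W, maximize=True)   # Hungarian-type algorithm, O(n^3)
    C = np.zeros_like(W)
    C[rows, cols] = 1
    print(C)                                  # the permutation shown on the slide
    print("total profit:", W[rows, cols].sum())   # 7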

  13. Bipartite Generalized Matching
  Same weight matrix W, now with a generalized assignment C (here b = 2):
               Motorola  Apple  IBM
    "laptop"      $0      $2    $2               0 1 1
    "server"      $0      $2    $3      →  C =   1 0 1
    "phone"       $2      $3    $0               1 1 0
  - Given W, max_{C ∈ B^{n×n}} ∑_ij W_ij C_ij such that ∑_i C_ij = ∑_j C_ij = b
  - Combinatorial b-matching problem, O(bn^3) (Google AdWords)
  - Creates a very loopy graphical model
  - Max product takes O(bn^3) for exact MAP (Huang & J 2007)
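
As a quick illustration (not the max-product algorithm from the slide), the b = 2 solution above can be recovered from the LP relaxation of the degree constraints: for bipartite degree constraints the constraint matrix is totally unimodular, so the LP optimum can be taken at an integral vertex. The use of scipy.optimize.linprog here is an assumption of convenience.

    import numpy as np
    from scipy.optimize import linprog

    W = np.array([[0, 2, 2],
                  [0, 2, 3],
                  [2, 3, 0]], dtype=float)
    n, b = 3, 2

    # Variables: C flattened row-major; maximise sum(W*C) -> minimise -W.
    c = -W.ravel()

    # Equality constraints: every row sum and every column sum equals b.
    A_eq = np.zeros((2 * n, n * n))
    for i in range(n):
        A_eq[i, i * n:(i + 1) * n] = 1          # row i sums to b
        A_eq[n + i, i::n] = 1                   # column i sums to b
    b_eq = np.full(2 * n, b)

    res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, 1), method="highs")
    C = np.round(res.x).reshape(n, n).astype(int)
    print(C)                     # an optimal b = 2 matching (ties are possible)
    print("total weight:", (W * C).sum())   # 12, same value as the slide's C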

  14. Bipartite Generalized Matching
  [Figure: bipartite graph over u_1, ..., u_4 and v_1, ..., v_4]
  - Graph G = (U, V, E) with U = {u_1, ..., u_n}, V = {v_1, ..., v_n}, and M(·) the set of neighbors of a node u_i or v_j
  - Define x_i ∈ X and y_j ∈ Y where x_i = M(u_i) and y_j = M(v_j)
  - Then p(X, Y) = (1/Z) ∏_i ∏_j ψ(x_i, y_j) ∏_k φ(x_k) φ(y_k), where
      φ(y_j) = exp(∑_{u_i ∈ y_j} W_ij)   and   ψ(x_i, y_j) = ¬(v_j ∈ x_i ⊕ u_i ∈ y_j)
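
To make the encoding concrete, here is a small sketch that evaluates these potentials on the 3x3 weights from slide 13. The helper names are hypothetical, and φ(x_k) is assumed to be the symmetric row version of φ(y_j) (counting each weight on both sides only rescales the objective, so the MAP matching is unchanged).

    import itertools
    import math

    # Weights from slide 13; b = 2.
    W = [[0, 2, 2],
         [0, 2, 3],
         [2, 3, 0]]
    n = 3

    # x_i is the set of v-indices matched to u_i; y_j is the set of u-indices matched to v_j.
    def phi_x(i, x_i):                 # assumed symmetric to phi(y_j): exp of the chosen row weights
        return math.exp(sum(W[i][j] for j in x_i))

    def phi_y(j, y_j):                 # phi(y_j) = exp(sum_{u_i in y_j} W_ij), as on the slide
        return math.exp(sum(W[i][j] for i in y_j))

    def psi(x_i, y_j, i, j):           # psi = NOT(v_j in x_i XOR u_i in y_j): both sides must agree
        return 1.0 if (j in x_i) == (i in y_j) else 0.0

    # A consistent setting corresponding to the b = 2 matching of slide 13.
    x = [{1, 2}, {0, 2}, {0, 1}]
    y = [{1, 2}, {0, 2}, {0, 1}]

    score = 1.0
    for i, j in itertools.product(range(n), range(n)):
        score *= psi(x[i], y[j], i, j)     # any inconsistency would zero out the joint
    for k in range(n):
        score *= phi_x(k, x[k]) * phi_y(k, y[k])
    print("unnormalised p(X, Y) for the b = 2 matching:", score)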

  15. Bipartite Generalized Matching
  Theorem (Huang & J 2007): Max product on G converges in O(bn^3) time.
  Proof sketch: form the unwrapped tree T of depth Ω(n); maximizing the belief at the root of T is equivalent to maximizing the belief at the corresponding node in G.
  [Figure: unwrapped tree rooted at u_1, with children v_1, ..., v_4, each expanding to copies of u_2, u_3, u_4]
  Theorem (Salez & Shah 2009): Under mild assumptions, max product 1-matching is O(n^2).

  16. Bipartite Generalized Matching
  Code at http://www.cs.columbia.edu/~jebara/code

  17. Generalized Matching
  [Plots: median running time of BP (max product) vs. GOBLIN as functions of n and b, shown for B = 5 and B = ⌊n/2⌋]
  Applications: unipartite matching, clustering (J & S 2006), classification (H & J 2007), collaborative filtering (H & J 2008), semisupervised learning (J et al. 2009), visualization (S & J 2009)
  - Max product is O(n^2) on dense graphs (Salez & Shah 2009)
  - Much faster than other solvers

  18. Unipartite Generalized Matching
  [Figure: k-nearest-neighbor graph with k = 2]

  19. Unipartite Generalized Matching
  [Figure: unipartite b-matching with b = 2]

  20. Unipartite Generalized Matching
  [Figures: left, k-nearest neighbors; right, unipartite b-matching]
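
The contrast in these figures can be reproduced in a few lines: a k-nearest-neighbor graph fixes each node's out-degree at k but leaves the symmetrized degrees uneven, while b-matching enforces exactly b neighbors per node. The sketch below (toy random points, my own construction) only builds the k-NN side; the balanced graph would come from a b-matching solver such as the code linked on slide 16.

    import numpy as np

    rng = np.random.default_rng(2)
    pts = rng.random((10, 2))                      # toy 2-D points
    k = 2

    # Pairwise distances; exclude self-edges by setting the diagonal to infinity.
    D = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)

    # Directed k-NN edges, then symmetrize (the usual k-NN graph construction).
    A = np.zeros((10, 10), dtype=int)
    nn = np.argsort(D, axis=1)[:, :k]
    for i, js in enumerate(nn):
        A[i, js] = 1
    A = np.maximum(A, A.T)

    print("k-NN (symmetrized) degrees:", A.sum(axis=1))   # typically uneven, unlike b-matching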

  21. Unipartite Generalized Matching
  Weight matrix W over points p_1, ..., p_4 and the b = 2 solution C:
             p_1  p_2  p_3  p_4
      p_1     0    2    1    2               0 1 0 1
      p_2     2    0    2    1      →  C =   1 0 1 0
      p_3     1    2    0    2               0 1 0 1
      p_4     2    1    2    0               1 0 1 0
  - max_{C ∈ B^{n×n}, C_ii = 0} ∑_ij W_ij C_ij such that ∑_i C_ij = b, C_ij = C_ji
  - Combinatorial unipartite matching is efficient (Edmonds 1965)
  - Makes an LP with exponentially many blossom inequalities
  - Max product is exact if the LP is integral (Sanghavi et al. 2008)
  - p(X) = ∏_{i ∈ V} δ(∑_{j ∈ Ne(i)} x_ij ≤ 1) ∏_{ij ∈ E} exp(W_ij x_ij)
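
For the 4-node example above the feasible set is tiny, so the unipartite b = 2 matching can be verified by brute force over all 2^6 edge subsets. This is a sanity-check sketch only; at realistic sizes one would use Edmonds-style blossom machinery or the max-product solver discussed earlier.

    import itertools
    import numpy as np

    # Weights from the slide over points p1..p4; find the best b = 2 unipartite matching.
    W = np.array([[0, 2, 1, 2],
                  [2, 0, 2, 1],
                  [1, 2, 0, 2],
                  [2, 1, 2, 0]])
    n, b = 4, 2
    edges = [(i, j) for i in range(n) for j in range(i + 1, n)]   # 6 undirected edges

    best_weight, best_C = -1, None
    for chosen in itertools.product([0, 1], repeat=len(edges)):
        C = np.zeros((n, n), dtype=int)
        for use, (i, j) in zip(chosen, edges):
            C[i, j] = C[j, i] = use
        if (C.sum(axis=1) == b).all():            # exactly b neighbors per node; C symmetric, C_ii = 0
            w = int((W * C).sum()) // 2           # each undirected edge is counted twice in W * C
            if w > best_weight:
                best_weight, best_C = w, C

    print(best_C)                                 # recovers the alternating pattern shown on the slide
    print("total weight:", best_weight)           # 8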
