Partial Optimality by Pruning for MAP-inference with General Graphical Models


  1. Partial Optimality by Pruning for MAP-inference with General Graphical Models. Paul Swoboda, Bogdan Savchynskyy, Jörg Kappes, Christoph Schnörr, Heidelberg University, Germany

  2. Segment the image...

  3. Optimal labeling

  4. Optimal labeling: NP-hard

  5. Optimal labeling is NP-hard, so instead: solve a convex relaxation (LP) and round the relaxed solution. No optimality guarantees (see the sketch after slide 6).

  6. Partial labeling: polynomially solvable, optimality guaranteed. Optimal labeling: NP-hard.
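
Slides 5 and 6 contrast exact MAP-inference with relaxation plus rounding. To make that concrete, here is a self-contained sketch of the standard local-polytope LP relaxation, solved with SciPy and rounded per variable. The toy potentials and the frustrated 3-cycle are illustrative assumptions, not anything from the talk: on this instance the LP bound lands strictly below every integer labeling, so the rounded labeling comes without an optimality certificate.

```python
import numpy as np
from scipy.optimize import linprog

L = 2                                      # labels per node
edges = [(0, 1), (1, 2), (2, 0)]           # an odd (frustrated) cycle
unary = np.zeros((3, L))                   # no unary preference
pair = np.array([[1.0, 0.0],               # pairwise cost: 1 if the two endpoints
                 [0.0, 1.0]])              # agree, 0 otherwise ("anti-Potts")

def n_idx(v, a):                           # LP index of the node marginal mu_v(a)
    return v * L + a

def e_idx(e, a, b):                        # LP index of the edge marginal mu_e(a,b)
    return 3 * L + e * L * L + a * L + b

n_vars = 3 * L + len(edges) * L * L
c = np.zeros(n_vars)                       # linear objective <theta, mu>
for v in range(3):
    for a in range(L):
        c[n_idx(v, a)] = unary[v, a]
for e, _ in enumerate(edges):
    for a in range(L):
        for b in range(L):
            c[e_idx(e, a, b)] = pair[a, b]

rows, rhs = [], []
for v in range(3):                         # normalization: sum_a mu_v(a) = 1
    row = np.zeros(n_vars)
    row[[n_idx(v, a) for a in range(L)]] = 1.0
    rows.append(row)
    rhs.append(1.0)
for e, (u, w) in enumerate(edges):         # marginalization constraints
    for a in range(L):                     # sum_b mu_e(a,b) = mu_u(a)
        row = np.zeros(n_vars)
        row[[e_idx(e, a, b) for b in range(L)]] = 1.0
        row[n_idx(u, a)] = -1.0
        rows.append(row)
        rhs.append(0.0)
    for b in range(L):                     # sum_a mu_e(a,b) = mu_w(b)
        row = np.zeros(n_vars)
        row[[e_idx(e, a, b) for a in range(L)]] = 1.0
        row[n_idx(w, b)] = -1.0
        rows.append(row)
        rhs.append(0.0)

res = linprog(c, A_eq=np.array(rows), b_eq=np.array(rhs), bounds=(0, None))
x = [int(np.argmax([res.x[n_idx(v, a)] for a in range(L)])) for v in range(3)]
rounded = sum(unary[v, x[v]] for v in range(3)) + sum(pair[x[u], x[w]] for u, w in edges)
print("LP bound:", res.fun)                       # about 0.0, strictly below every integer
print("rounded labeling:", x, "energy:", rounded)  # labeling (each costs at least 1), so the
                                                   # rounded labeling has no optimality certificate
```

A partial-optimality method sidesteps exactly this gap: instead of claiming a fully optimal labeling, it certifies optimality for a subset of the variables, as slide 6 states.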

  7. Energy Minimization: MAP-Inference with Graphical Models
      $\arg\min_{x \in X} J(x) := \sum_{f \in F} \theta_f(x_{\mathrm{ne}(f)})$
      $x_i \in \{1, \ldots, N\}$ - variable
      $G = (V, F, E)$ - factor graph
      $f \in F$ - factor
      $\theta_f(x_{\mathrm{ne}(f)})$, $x_{\mathrm{ne}(f)} \in \{1, \ldots, N\}^{|\mathrm{ne}(f)|}$ - potential of factor $f$
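
To make the notation on slide 7 concrete, here is a small self-contained sketch (the factor scopes and tables are made-up toy numbers, not anything from the talk): a factor graph stored as (scope, table) pairs, the energy $J(x)$ evaluated by summing factor potentials, and a brute-force minimization whose exponential cost is exactly what makes exact MAP-inference NP-hard in general.

```python
import itertools
import numpy as np

# A factor is a pair (scope, table): 'scope' lists the variables the factor touches
# and 'table' stores theta_f for every joint assignment of those variables.
factors = [
    ((0,), np.array([0.0, 1.0])),                   # unary potential on variable 0
    ((1,), np.array([0.5, 0.0])),                   # unary potential on variable 1
    ((0, 1), np.array([[0.0, 2.0], [2.0, 0.0]])),   # pairwise potential on (0, 1)
]
num_vars, num_labels = 2, 2

def energy(x, factors):
    """J(x) = sum_f theta_f(x_ne(f)): add each factor's entry for x restricted to its scope."""
    return sum(table[tuple(x[v] for v in scope)] for scope, table in factors)

# Brute force over all N^|V| labelings; this exponential search is what NP-hardness rules out at scale.
best = min(itertools.product(range(num_labels), repeat=num_vars),
           key=lambda x: energy(x, factors))
print(best, energy(best, factors))                  # (0, 0) with energy 0.5
```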

  8. Easy Examples. Figure: color segmentation [Lellman 2010]

  9. Difficult Examples. Figure: (a) color segmentation [Lellman 2010], (b) stereo [Szeliski et al. 2008], (c) panorama stitching [Agarwala et al. 2004]

  10. Related Work

      Work                     non-binary   higher order   non-Potts   Aux. problem
      Boros & Hammer 2002          -             -             +       QPBO
      Kovtun 2003                  +             -             -       submodular
      Rother et al. 2007           -             -             +       QPBO
      Kohli et al. 2008            +             -             +       QPBO
      Kovtun 2005                  +             -             +       submodular
      Fix et al. 2011              -             +             +       QPBO
      Kahl & Strandmark 2012       -             +             +       bi-submodular
      Windheuser et al. 2012       +             +             +       bi-submodular
      Swoboda et al. 2013          +             -             -       LP
      Shekhovtsov 2014             +             -             +       LP
      Ours                         +             +             +       any relaxation

  14. Algorithm Outline
      Initialize: generate a labeling proposal
      repeat
          verify the proposal on the current graph
          shrink the graph
      until verification succeeds

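The outline above can be read as a simple control loop. Below is a schematic, runnable Python sketch of it; the callables `propose`, `verify` and `shrink` are hypothetical stand-ins for the components described on the following slides (relaxation-based verification of a perturbed problem and pruning of non-certified nodes), not the authors' implementation.

```python
def prune_for_partial_optimality(nodes, propose, verify, shrink):
    """Return a proposal together with the node set on which it could be certified."""
    proposal = propose(nodes)                 # e.g. a rounded relaxation solution
    active = set(nodes)                       # the current (shrinking) graph
    while True:
        bad = verify(proposal, active)        # nodes where verification fails
        if not bad:                           # verification succeeded on 'active':
            return proposal, active           # these labels are reported persistent
        active = shrink(active, bad)          # drop the non-certified nodes and retry

# Dummy components, only to show the control flow: node 4 never verifies.
proposal, persistent = prune_for_partial_optimality(
    nodes=range(5),
    propose=lambda ns: {v: 0 for v in ns},
    verify=lambda prop, act: {4} & act,
    shrink=lambda act, bad: act - bad,
)
print(persistent)                             # {0, 1, 2, 3}
```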

  18. Algorithm: Proposed Partial Labeling
      $J(x) = \sum_{v \in V} \theta_v(x_v) + \sum_{uv \in E} \theta_{uv}(x_v, x_u)$
      Labeling: $x \in X$
      Partial optimality: $\chi \in \{0, 1\}^{|V|}$

  19. Algorithm: Perturbed Problem
      $J_\chi(x) = \sum_{v \in V} \theta_v(x_v) + \sum_{uv \in E} \theta_{uv}(x_v, x_u) + \sum_{v \in V} \bar{\theta}_v(x_v)$

  20. Algorithm: Relaxation of the Perturbed Problem
      $\hat{x} = \arg\min_{x \in X} J_\chi(x)$ is NP-hard $\rightarrow$ solve a relaxation

  21. Algorithm: Shrinking Rule
      $\hat{x}_i \neq x_i \rightarrow \chi_i = 0$
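
In code, the shrinking rule of slide 21 is a one-liner. A minimal sketch with toy arrays (the labelings below are illustrative, not from the talk):

```python
import numpy as np

proposal  = np.array([2, 0, 1, 1, 3])    # x   : the proposed labeling
perturbed = np.array([2, 0, 2, 1, 3])    # x^  : (relaxed) minimizer of the perturbed problem
chi = np.ones(len(proposal), dtype=int)  # chi : start by assuming every label is persistent

chi[perturbed != proposal] = 0           # shrinking rule: x^_i != x_i  =>  chi_i = 0
print(chi)                               # [1 1 0 1 1]: variable 2 is pruned from the graph
```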

  28. Use of Approximate Solvers. Do we need to solve the relaxed problem $\hat{x} = \arg\min_{x \in X} J_\chi(x)$ exactly? No! Approximate solvers with an optimality certificate (like TRW-S [Kolmogorov 2005]) are allowed here.
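
For intuition, one common form such a certificate takes (a hedged illustration of the general principle, not the specific criterion used in this work): a dual solver like TRW-S maintains a lower bound on the energy, and any labeling whose energy attains that bound is provably optimal. The inputs below are assumed to come from such a solver.

```python
def certified_optimal(lower_bound, labeling_energy, tol=1e-9):
    """No labeling can cost less than a valid lower bound, so matching it certifies optimality."""
    return labeling_energy <= lower_bound + tol

print(certified_optimal(lower_bound=12.5, labeling_energy=12.5))  # True: zero duality gap
print(certified_optimal(lower_bound=12.5, labeling_energy=13.0))  # False: a gap remains
```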

  29. Results

      Experiment (N)     MQPBO    Kovtun   GRD      Fix      Ours
      teddy              0        n/a      n/a      n/a      0.4423
      venus              0        n/a      n/a      n/a      0.0009
      family             0.0432   n/a      n/a      n/a      0.0611
      pano               0.1247   n/a      n/a      n/a      0.5680
      Potts (12)         0.1839   0.7475   n/a      n/a      0.9231
      side-chain (21)    0.0247   n/a      n/a      n/a      0.6513
      protein (8)        n/a      n/a      0.2603   0.2545   0.7799
      cell-tracking      n/a      n/a      n/a      0.1771   0.9992
      geo-surf (50)      n/a      n/a      n/a      n/a      0.8407

      Table: percentage of persistent variables; n/a - method inapplicable. We used the local
      polytope relaxation, with TRW-S and CPLEX as solvers. Benchmarks: [Szeliski et al. 2008],
      [Kappes et al. 2013], [PIC 2011]

  31-35. Potts models: Results. Figures comparing our method (Ours) with Kovtun's method on several Potts instances.

  36. Take Home Message and Outlook. We presented a generic method for partial optimality for MAP-inference, which can employ any relaxation, can use certain approximate solvers in the loop (e.g. TRW-S), and scales as well as the underlying MAP-inference solver.

  37. Code: preliminary research code is available at http://paulswoboda.net; revised code will be included in the OpenGM library soon.
