Near-Optimal Joint Object Matching via Convex Relaxation
Yuxin Chen (Stanford University), joint work with Qixing Huang (TTIC) and Leonidas Guibas (Stanford)
October 22, 2014


  1. Title slide: Near-Optimal Joint Object Matching via Convex Relaxation. Yuxin Chen, Stanford University; joint work with Qixing Huang (TTIC) and Leonidas Guibas (Stanford). October 22, 2014.

  2. Assembling Fractured Pieces: computer assembly (fig. credit: Huang et al. '06) vs. manual assembly (Ephesus, Turkey).

  3. Structure from Motion from Internet Images.

  4. Data-Driven Shape Analysis. Example: joint segmentation.

  5. Joint Object/Graph Matching • Given: n objects (graphs), each containing a few elements (vertices) • Goal: consistently match all similar elements across all objects.

  6. Naive Approach: Pairwise Matching ◦ Compute pairwise matchings across all pairs in isolation ◦ Pairwise matching has been extensively explored.

  7. Are Pairwise Methods Perfect?

  8. Additional Objects Help!

  9. Popular Approach: 2-Stage Method • Stage 1: Pairwise Matching ◦ compute pairwise matchings across a few pairs in isolation ◦ use off-the-shelf pairwise methods • Stage 2: Global Refinement ◦ jointly refine all provided maps ◦ criterion: exploit global consistency.

  10. Object Representation • Object: a set of points drawn from the same universe • Map: point-to-point correspondence.

  11. Problem Formulation • Input: a few pairwise matches computed in isolation • Output: a collection of maps that are ◦ close to the input matches ◦ globally consistent • The problem is NP-hard! [Huber '02]

  12. Prior Art • Spanning tree optimization [Huber '02] • Detecting inconsistent cycles [Zach '10, Ngu '11] • Spectral techniques [Kim '12, Huang '12] • Pros: empirical success • Cons: ◦ little fundamental understanding (except [Huang & Guibas '13]) ◦ reliance on hyper-parameter tuning.

  13. Advances in Fundamental Understanding • Semidefinite relaxation [Huang & Guibas '13]: ◦ theoretical guarantees under a basic setup ◦ tolerates 50% input errors • Spectral method [Pachauri et al. '13]: ◦ recovery ability improves with the number of objects ◦ assumes Gaussian-Wigner noise (not realistic, though) • Several important challenges remain unaddressed • Relevant problems: rotation synchronization (Wang et al.), multiway alignment (Bandeira et al.).

  14. Challenge 1: Dense Input Errors • A significant fraction of inputs are corrupted ◦ Prior art tolerates up to 50% input errors [Huang & Guibas '13] (figure: ground truth vs. input maps).

  15. Challenge 2: Partial Similarity • Objects might only be partially similar to each other ◦ e.g., restricted views at different camera positions (figure: input maps; subgraph matching).

  16. Challenge 3: Incomplete Input • Pairwise matching across all object pairs is ◦ computationally expensive ◦ sometimes inadmissible.

  17. Our Goal • Develop an effective joint recovery method that is ◦ backed by strong theoretical guarantees addressing the 3 challenges (tolerate dense errors, handle partial similarity, fill in missing matches) ◦ parameter-free ◦ computationally feasible.

  18. (Partial) Maps • One-to-one maps between (sub)sets of elements ◦ subgraph matching / isomorphism • Encode the map across 2 objects by a 0-1 matrix:

      X_{12} := \begin{bmatrix} 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \\ 1 & 0 & 0 & 0 \end{bmatrix}
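
     To make the encoding concrete, here is a minimal NumPy sketch (my illustration, not from the talk) that builds this X_12 and checks the defining property of a partial map: at most one 1 per row and per column.

      import numpy as np

      # 0-1 matrix encoding a (partial) map between object 1 (3 elements)
      # and object 2 (4 elements): X12[i, j] == 1 iff element i of
      # object 1 is matched to element j of object 2.
      X12 = np.array([[0, 0, 1, 0],
                      [0, 0, 0, 1],
                      [1, 0, 0, 0]])

      # A (partial) map is one-to-one: at most one 1 per row and column.
      assert X12.sum(axis=1).max() <= 1
      assert X12.sum(axis=0).max() <= 1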

  19. Matrix Representation • Consider n objects • Matrix representation for a collection of maps:

      X = \begin{bmatrix} I & X_{12} & \cdots & X_{1n} \\ X_{21} & I & \cdots & X_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ X_{n1} & X_{n2} & \cdots & I \end{bmatrix}

  ◦ Diagonal blocks: identity matrices (self-isomorphism) ◦ Sparse.
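
     A small sketch (mine, using a hypothetical n = 2 and the X_12 above) of assembling the block map matrix in NumPy:

      import numpy as np

      X12 = np.array([[0, 0, 1, 0],
                      [0, 0, 0, 1],
                      [1, 0, 0, 0]])

      # Block map matrix for n = 2 objects with 3 and 4 elements:
      # identity blocks on the diagonal (self-maps), and X21 = X12^T.
      X = np.block([[np.eye(3, dtype=int), X12],
                    [X12.T, np.eye(4, dtype=int)]])

      assert (X == X.T).all()   # a consistent map matrix is symmetric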

  20. Alternative Representation: Augmented Universe • All objects / sets are sub-sampled from the same universe (of size m) • Map matrix Y_i between object i and the universe, each with m columns:

      Y_1 := \begin{bmatrix} 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \\ 0 & 0 & 1 & 0 & 0 \end{bmatrix}, \quad
      Y_2 := \begin{bmatrix} 0 & 0 & 1 & 0 & 0 \\ 1 & 0 & 0 & 0 & 0 \\ 0 & 0 & 0 & 0 & 1 \\ 0 & 0 & 0 & 1 & 0 \end{bmatrix}
      \;\Rightarrow\; X_{12} = Y_1 Y_2^\top
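
     The factorization is easy to sanity-check numerically with the matrices above:

      import numpy as np

      # Object-to-universe maps (m = 5): Y[i, k] == 1 iff element i of
      # the object corresponds to universe element k.
      Y1 = np.array([[0, 0, 0, 0, 1],
                     [0, 0, 0, 1, 0],
                     [0, 0, 1, 0, 0]])
      Y2 = np.array([[0, 0, 1, 0, 0],
                     [1, 0, 0, 0, 0],
                     [0, 0, 0, 0, 1],
                     [0, 0, 0, 1, 0]])

      X12 = np.array([[0, 0, 1, 0],
                      [0, 0, 0, 1],
                      [1, 0, 0, 0]])

      # Two elements match iff they map to the same universe element.
      assert (Y1 @ Y2.T == X12).all()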

  21. P.S.D. and Low-Rank Structure • Alternative representation:

      X := \begin{bmatrix} I & X_{12} & \cdots & X_{1n} \\ X_{21} & I & \cdots & X_{2n} \\ \vdots & \vdots & \ddots & \vdots \\ X_{n1} & X_{n2} & \cdots & I \end{bmatrix}
        = \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix}
          \begin{bmatrix} Y_1^\top & Y_2^\top & \cdots & Y_n^\top \end{bmatrix}

  • X is positive semidefinite and low rank: rank(X) ≤ m ◦ m: universe size.
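
     Continuing the same toy example, both structural properties can be verified directly:

      import numpy as np

      Y1 = np.array([[0, 0, 0, 0, 1],
                     [0, 0, 0, 1, 0],
                     [0, 0, 1, 0, 0]])
      Y2 = np.array([[0, 0, 1, 0, 0],
                     [1, 0, 0, 0, 0],
                     [0, 0, 0, 0, 1],
                     [0, 0, 0, 1, 0]])

      Y = np.vstack([Y1, Y2])   # stack the object-to-universe maps
      X = Y @ Y.T               # consistent map matrix X = Y Y^T

      assert np.linalg.eigvalsh(X).min() >= -1e-9     # X is PSD
      assert np.linalg.matrix_rank(X) <= Y.shape[1]   # rank(X) <= m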

  22. Summary of Matrix Structure • A consistent map matrix X = Y Y^⊤ is 1. positive semidefinite (X ⪰ 0) 2. low rank 3. sparse (0-1 matrix) 4. X_{ii} = I • The input map matrix X^{in} is a noisy version of X: ◦ input errors ◦ missing entries (incomplete inputs) (figure: input maps X^{in} vs. ground truth X).
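
     To make the noise model concrete, the sketch below is my own construction; it assumes corrupted blocks come from independent uniformly random permutations of the universe, which is consistent with the expectation formula two slides down. It generates observed blocks of X^{in} from the Y_i:

      import numpy as np

      rng = np.random.default_rng(0)

      def noisy_input(Ys, p_true):
          """With probability p_true the input block X^in_{ij} is the
          true map Y_i Y_j^T; otherwise it is the map induced by an
          independent random permutation of the universe. Under this
          model, E[X^in] = p_true * X + (1 - p_true) * (1/m) 1 1^T."""
          m = Ys[0].shape[1]
          blocks = {}
          for i in range(len(Ys)):
              for j in range(i + 1, len(Ys)):
                  if rng.random() < p_true:
                      blocks[(i, j)] = Ys[i] @ Ys[j].T             # correct
                  else:
                      P = np.eye(m, dtype=int)[rng.permutation(m)]
                      blocks[(i, j)] = Ys[i] @ P @ Ys[j].T         # corrupted
          return blocks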

  23. Low Rank + Sparse Matrix Separation? • input maps X^{in} = ground truth X + additive errors (X^{in} − X) • Robust PCA / matrix completion? [Candès et al.; Chandrasekaran et al.]:

      \underset{L,\,S}{\text{minimize}} \;\; \|L\|_* + \|S\|_1 \quad \text{s.t.} \quad X^{in} = \underbrace{L}_{\text{low rank}} + \underbrace{S}_{\text{sparse}}

  ◦ L (the low-rank component) serves as the estimate of X.
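
     A minimal CVXPY rendering of this separation program (my sketch; the file name X_in.npy is a placeholder, and the weight lam, not shown on the slide, is set to the standard Robust PCA choice):

      import cvxpy as cp
      import numpy as np

      X_in = np.load("X_in.npy")        # hypothetical observed map matrix
      N = X_in.shape[0]
      lam = 1.0 / np.sqrt(N)            # standard Robust PCA weight

      L = cp.Variable((N, N))           # low-rank component: estimate of X
      S = cp.Variable((N, N))           # sparse component: input errors

      prob = cp.Problem(cp.Minimize(cp.normNuc(L) + lam * cp.sum(cp.abs(S))),
                        [X_in == L + S])
      prob.solve()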

  24. Outlier Component is Highly Biased • Robust PCA can handle dense corruption if the sparse component exhibits random sign patterns • Our case:

      \mathbb{E}\big[X^{in} - X\big] = \underbrace{(1 - p_{true})}_{\text{corruption rate}} \Big( \tfrac{1}{m}\,\mathbf{1}\mathbf{1}^\top - X \Big),
      \quad \text{i.e.} \quad
      \mathbb{E}\big[X^{in}\big] = p_{true}\, X + (1 - p_{true}) \cdot \tfrac{1}{m}\,\mathbf{1}\mathbf{1}^\top

  ◦ the mean error is highly biased, with spectral norm (1 − p_true) n.
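
     A one-line justification of the spectral-norm claim (my derivation, assuming full observations so that X is nm × nm and each object contains all m universe elements):

      \big\| \mathbb{E}[X^{in} - X] \big\|
        = (1 - p_{true}) \Big\| \tfrac{1}{m}\,\mathbf{1}\mathbf{1}^\top - X \Big\|
        \asymp (1 - p_{true})\, n,
      \quad \text{since} \;
      \Big\| \tfrac{1}{m}\,\mathbf{1}\mathbf{1}^\top \Big\| = \tfrac{nm}{m} = n
      \; \text{and} \;
      \|X\| = \Big\| \textstyle\sum_i Y_i^\top Y_i \Big\| \le n
      \; (\text{each } Y_i^\top Y_i \text{ is a 0-1 diagonal matrix}).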

  25. Debias the Error Components • Original form:

      X := \begin{bmatrix} Y_1 \\ Y_2 \\ \vdots \\ Y_n \end{bmatrix} \begin{bmatrix} Y_1^\top & Y_2^\top & \cdots & Y_n^\top \end{bmatrix} \succeq 0

  • Augmented form:

      \begin{bmatrix} m & \mathbf{1}^\top \\ \mathbf{1} & X \end{bmatrix}
        = \begin{bmatrix} \mathbf{1}^\top \\ Y_1 \\ \vdots \\ Y_n \end{bmatrix}
          \begin{bmatrix} \mathbf{1} & Y_1^\top & \cdots & Y_n^\top \end{bmatrix} \succeq 0

  • Equivalently (by the Schur complement): X − (1/m) 1 1^⊤ ⪰ 0 (debiasing) • rank(X − (1/m) 1 1^⊤) = rank(X) − 1 ⇒ one more degree of freedom.
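
     Numerically, debiasing preserves positive semidefiniteness and removes exactly one degree of freedom. A quick check (my sketch; it uses full random permutation maps so that rank(X) = m and the rank drop is exactly one):

      import numpy as np

      rng = np.random.default_rng(1)
      m, n = 5, 4
      # Each object sees the whole universe here (permutation maps).
      Ys = [np.eye(m, dtype=int)[rng.permutation(m)] for _ in range(n)]
      Y = np.vstack(Ys)
      X = Y @ Y.T

      D = X - np.ones((n * m, n * m)) / m   # debiased matrix
      assert np.linalg.eigvalsh(D).min() >= -1e-9   # still PSD
      assert np.linalg.matrix_rank(D) == np.linalg.matrix_rank(X) - 1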

  26. Objective Function • Feasible set: X ≥ 0 (elementwise), X ⪰ 0 • Encourage consistency with the provided maps: ⟨X, X^{in}⟩ (to maximize) • Promote sparsity: ‖X‖_1 = ⟨X, 1 1^⊤⟩ (to minimize).
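
     Combining the pieces above (debiased PSD cone, elementwise nonnegativity, consistency term, sparsity penalty) gives a convex program. The CVXPY sketch below is my own assembly of these ingredients, not necessarily the talk's exact formulation; X_in.npy, the universe size m, and the trade-off weight lam are placeholders:

      import cvxpy as cp
      import numpy as np

      X_in = np.load("X_in.npy")   # hypothetical observed map matrix (N x N)
      N = X_in.shape[0]
      m = 5                        # universe size (assumed known here)
      lam = 0.5                    # sparsity trade-off weight (assumption)

      X = cp.Variable((N, N), symmetric=True)

      constraints = [
          X - np.ones((N, N)) / m >> 0,   # debiased PSD: X - (1/m) 1 1^T >= 0
          X >= 0,                         # elementwise nonnegativity
      ]
      # One would also fix the diagonal blocks: X_ii = I for each object i.

      # Maximize consistency with the input maps, minus the l1 penalty
      # <X, 1 1^T> (equal to the sum of entries, since X >= 0).
      prob = cp.Problem(cp.Maximize(cp.trace(X_in @ X) - lam * cp.sum(X)),
                        constraints)
      prob.solve()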
