

SLIDE 1

INF562 – Géométrie Algorithmique et Applications

Curve and surface reconstruction

Steve Oudot

SLIDE 2

Reconstruction Paradigm

Q What do you see? Why?

SLIDE 3

Reconstruction Paradigm

Q What do you see? Why?

http://jolicoloriage.free.fr

SLIDE 5

Reconstruction Paradigm

Q What do you see? Why?

http://jolicoloriage.free.fr

without the numbers...

SLIDE 6

Reconstruction Paradigm (Cont’d)

SLIDE 7

Reconstruction Paradigm (Cont'd)

Q Given a point cloud, build a faithful (implicit, PL, ...) approximation of the shape underlying the data.

SLIDE 8

Reconstruction Paradigm (Cont'd)

The reconstruction problem is ill-posed by nature.

SLIDE 9-10

Reconstruction Paradigm (Cont'd)

(Speaker note: in this example we assume that the underlying shape has positive reach.)

The reconstruction problem is ill-posed by nature.
→ make assumptions on the underlying shape, e.g.: fix dimension, topological type, regularity (differentiability), Hausdorff distance to input...

SLIDE 11

Reconstruction Paradigm (Cont'd)

The reconstruction problem is ill-posed by nature.
→ make assumptions on the underlying shape, e.g.: fix dimension, topological type, regularity (differentiability), Hausdorff distance to input...
→ for a suitable choice of hypotheses, the solution becomes unique up to a set of local regular deformations (the solution is never unique!)

SLIDE 12

Other (weaker) forms of reconstruction

(Speaker note: the reconstruction problem thus has a long history and has been studied all along the development of new acquisition techniques. It is far from an isolated problem, however: it is naturally related to a class of problems in machine learning, of which it is in some sense a more accomplished version.)

SLIDE 13-15

Other (weaker) forms of reconstruction: clustering → topological inference → reconstruction

SLIDE 16

Where does the data come from?

3D scans
• Sources: LASER, stereo vision, mechanical sensor
• Applications: reverse engineering, prototyping, quality control

Stanford Michelangelo Project

SLIDE 17

Where does the data come from?

Medical imaging
• Sources: MRI scan, echograph, ...
• Applications: diagnostic, endoscopy simulation, surgical intervention planning

SLIDE 18

Where does the data come from?

Geography, geology
• Sources: satellite/aerial images, ground probing, seismograph
• Applications: map making / terrain modeling, prospection (tunnels, oil)

SLIDE 19

Where does the data come from?

Higher dimensions (machine learning)
• Sources: databases, simulations
• Applications: path planning, pattern recognition, image processing, ...

SLIDE 20

Various reconstruction techniques

Delaunay-based

  • Crust / Power Crust
  • Cocone
  • Gabriel / α-shape / β-skeleton
  • flow complex

Implicitization

  • Local polynomial fitting
  • Natural Neighbors (Voronoi-based)
  • Radial Basis Functions

Projection operators

  • Moving Least Squares
  • Extremal surfaces

For arbitrary dimensions and co-dimensions

  • Unions of balls / nerves
  • Witness Complex

SLIDE 21

What Delaunay has to do with reconstruction

SLIDE 22

What Delaunay has to do with reconstruction

→ a faithful approximation of the curve appears as a subcomplex of the Delaunay triangulation
→ this should hold whenever the point cloud is sufficiently densely sampled along the curve

SLIDE 23

What Delaunay has to do with reconstruction

Q What is this good subcomplex? Can it be defined in some canonical way?

SLIDE 24-26

Restricted Delaunay triangulation

Def: DelS(P) := {σ ∈ Del(P) | σ∗ ∩ S ≠ ∅}, where σ∗ denotes the Voronoi face dual to σ.

SLIDE 27-28

Approximation power of the restricted Delaunay

→ Our assumptions:
1. the underlying shape S is a closed curve or surface with positive reach ̺S
2. the point cloud P is an ε-sample of S with ε ∈ O(̺S)

→ analogy with 1-d signal theory (Shannon's reconstruction theorem):
1'. the underlying signal is a weighted sum of sinusoids (bounded bandwidth)
2'. the sampling has ≥ 2 samples per period

SLIDE 29-30

Approximation power of the restricted Delaunay

Theorem [Amenta et al. 1998-99]: If S is a curve or surface with positive reach, and if P is an ε-sample of S with ε < ̺S (curve) or ε < 0.1 ̺S (surface), then:
• DelS(P) is homeomorphic to S,
• dH(DelS(P), S) ∈ O(ε²),
• ∀f ∈ DelS(P), ∀v ∈ f, the angle ∠(nf, nvS) between the normal of f and the normal of S at v is in O(ε),
• ... (similar areas, curvature estimation, etc.)

→ to be made explicit: ε-sampling, reach

SLIDE 31

ε-samples

Def: P is an ε-sample of S if ∀x ∈ S, min{‖x − p‖ | p ∈ P} ≤ ε.
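With this definition in hand, the sampling parameter of a finite point set can be estimated numerically. The sketch below is not from the slides: it assumes S is represented by a dense reference discretization `S_pts`, and computes the smallest ε for which P is an ε-sample (a one-sided Hausdorff distance).

```python
import numpy as np

def sampling_epsilon(S_pts, P):
    """Smallest eps such that P is an eps-sample of S, with S represented
    by a dense discretization S_pts: max over x in S of min_p ||x - p||."""
    d = np.linalg.norm(S_pts[:, None, :] - P[None, :, :], axis=2)
    return float(d.min(axis=1).max())

# usage: 8 points on the unit circle, checked against a dense discretization
t = np.linspace(0.0, 2.0 * np.pi, 1000, endpoint=False)
S_pts = np.c_[np.cos(t), np.sin(t)]
P = S_pts[::125]                      # every 125th point: 8 samples
eps = sampling_epsilon(S_pts, P)      # close to 2*sin(pi/16), about 0.39
```

The quadratic distance matrix is fine for small inputs; for large clouds one would swap in a k-d tree, but that is beside the point here.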

SLIDE 32-34

Shapes with positive reach [Federer 1958]

Def: MS (the medial axis of S) is the closure of the set of points of Rd that have ≥ 2 nearest neighbors on S.

Def: ∀x ∈ S, lfs(x) = min{‖x − m‖ | m ∈ MS} (the local feature size).

Def: ̺S = min{d(x, MS) | x ∈ S} (the reach).

SLIDE 35-36

Shapes with positive reach [Federer 1958]

Examples (figure):
• a convex shape: ̺S = +∞
• a shape that is C1,1 but not C2, with a feature of radius r: ̺S = r
• the graph of x ↦ x³ sin(1/x) (C1 but not C1,1): ̺S = 0
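As a worked example of these definitions (not on the slides), the reach of a circle follows directly: its medial axis is its center, so

```latex
S=\{x\in\mathbb{R}^2 : \|x\|=r\} \;\Rightarrow\; M_S=\{0\},\qquad
\mathrm{lfs}(x)=\min_{m\in M_S}\|x-m\|=r\quad\forall x\in S,\qquad
\varrho_S=\min_{x\in S}\mathrm{lfs}(x)=r.
```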

slide-37
SLIDE 37

Insist on the fact that these properties allow to avoid using arguments from smooth analysis while making the reasoning hold for a larger class of shape. Mention also the fact that we only need two of these fundamental results for the analysis of curves.

Shapes with positive reach (Cont’d)

→ Fundamental properties: (see [Federer 1958])

Tangent Ball Lemma: ∀x ∈ S, ∀c ∈ nxS, x − c < lfs(x) ⇒ B(c, x − c) ∩ S = ∅. S MS x

lfs(x)

slide-38
SLIDE 38

Insist on the fact that these properties allow to avoid using arguments from smooth analysis while making the reasoning hold for a larger class of shape. Mention also the fact that we only need two of these fundamental results for the analysis of curves.

Shapes with positive reach (Cont’d)

→ Fundamental properties: (see [Federer 1958])

Tangent Ball Lemma: ∀x ∈ S, ∀c ∈ nxS, x − c < lfs(x) ⇒ B(c, x − c) ∩ S = ∅. S Topological Ball Lemma: If S is a k-manifold, then ∀B(c, r) s.t. B(c, r) ∩ MS = ∅, B(c, r) ∩ S is either empty or a topological k-ball. c

SLIDE 40-45

Approximation power of the restricted Delaunay

Proof for curves:
→ show that every edge of DelS(P) connects consecutive points of P along S, and vice versa.

(edge pq ∈ DelS(P) ⇒ p, q consecutive along S)
Let c ∈ pq∗ ∩ S. Then r = ‖c − p‖ = ‖c − q‖ = d(c, P) ≤ ε < ̺S ≤ lfs(c)
⇒ B(c, r) ∩ S is a topological arc.
If some s ∈ P \ {p, q} belonged to this arc, the arc would be tangent to ∂B(c, r) at p, q or s (say s)
⇒ d(c, P) = r = ‖c − s‖ ≥ lfs(s) > ε, contradicting the hypothesis of the theorem.

(p, q consecutive along S ⇒ pq ∈ DelS(P))
Let c ∈ arcS(pq) ∩ ∂p∗. Then c ∈ ps∗ for some s ∈ P \ {p}
⇒ p, s are consecutive along S, with c ∈ arcS(ps) (by the previous part of the proof)
⇒ s = q ⇒ pq ∈ DelS(P).

Conclusion:
⇒ DelS(P) is homeomorphic to S between each pair of consecutive points of P. Since DelS(P) is embedded in Del(P), it does not self-intersect ⇒ global homeomorphism.

SLIDE 46-47

Computing the restricted Delaunay

Q How to compute DelS(P) when S is unknown?

→ a whole family of algorithms uses various Delaunay extraction criteria:
• crust
• power crust
• cocone
• tight cocone
• · · ·
SLIDE 48-54

Crust algorithm [Amenta et al. 1997-98]

1. Compute the Delaunay triangulation of P
2. Compute the poles (furthest Voronoi vertices)
3. Add the poles to the set of vertices
4. Keep the Delaunay simplices whose vertices are all in P

SLIDE 55-58

Crust algorithm [Amenta et al. 1997-98]

→ in 2-d, crust = DelS(P) ≈ S
→ in 3-d, crust ⊇ DelS(P) ≈ S ⇒ manifold extraction step in post-processing
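For illustration (not part of the slides), the 2-d crust can be sketched in a few lines of Python. Here all Voronoi vertices are used in place of the poles, the usual simplification in the plane, and scipy's qhull bindings are assumed available.

```python
import numpy as np
from scipy.spatial import Delaunay, Voronoi

def crust_2d(P):
    """2-d crust: Delaunay of P together with the Voronoi vertices of P,
    keeping only the edges whose two endpoints are sample points."""
    V = Voronoi(P).vertices            # step 2: Voronoi vertices (poles in 2-d)
    tri = Delaunay(np.vstack([P, V]))  # steps 1+3: Delaunay of the augmented set
    n = len(P)
    edges = set()
    for t in tri.simplices:            # step 4: keep edges with endpoints in P
        for i in range(3):
            a, b = sorted((int(t[i]), int(t[(i + 1) % 3])))
            if b < n:
                edges.add((a, b))
    return edges

# usage: a dense sample of an ellipse is reconstructed as a closed polygon
t = np.linspace(0.0, 2.0 * np.pi, 40, endpoint=False)
P = np.c_[np.cos(t), 0.6 * np.sin(t)]
E = crust_2d(P)
```

On this sample, which satisfies the ε-sampling condition comfortably, every point ends up with exactly two crust neighbors, i.e. the output is the expected closed polygonal curve.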

slide-59
SLIDE 59

transition : voici l’etat de l’art il y a quelques annees. On savait reconstruire des courbes dans le plan et des surfaces dans R3 lisses sans bruit avec garanties. Puis on a commence a s’interesser aux objets non lisses, non manifold, en toutes dimensions, avec du bruit. Et la, un probleme fondamental est apparu...

Back to the reconstruction paradigm

Q What do you see? Why?

slide-60
SLIDE 60

transition : voici l’etat de l’art il y a quelques annees. On savait reconstruire des courbes dans le plan et des surfaces dans R3 lisses sans bruit avec garanties. Puis on a commence a s’interesser aux objets non lisses, non manifold, en toutes dimensions, avec du bruit. Et la, un probleme fondamental est apparu...

Back to the reconstruction paradigm

Q What do you see? Why?

slide-61
SLIDE 61

transition : voici l’etat de l’art il y a quelques annees. On savait reconstruire des courbes dans le plan et des surfaces dans R3 lisses sans bruit avec garanties. Puis on a commence a s’interesser aux objets non lisses, non manifold, en toutes dimensions, avec du bruit. Et la, un probleme fondamental est apparu...

Back to the reconstruction paradigm

→ When the dimensionality of the data is unknown or there is noise, the reconstruction result depends on the scale at which the data is looked at. → need for multi-scale reconstruction techniques

SLIDE 62-63

Multi-scale approach in a nutshell

→ build a one-parameter family of complexes approximating the input at various scales
→ connections with manifold learning and topological persistence

SLIDE 64-69

Multi-scale algorithm [Guibas, Oudot 2007]

(the simplicial complex serves as an approximation)

Input: a finite point set W ⊂ Rn
→ resample W iteratively, and maintain a simplicial complex:

Let L := {p}, for some p ∈ W;
while L ≠ W
    Let q := argmax_{w ∈ W} d(w, L);
    L := L ∪ {q};
    update simplicial complex;
end while

Output: the sequence of simplicial complexes
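The resampling loop above, without the complex-maintenance step, is plain farthest-point (maxmin) sampling; a minimal numpy sketch (the function name is mine, not the authors'):

```python
import numpy as np

def maxmin_landmarks(W, k, seed=0):
    """Greedy farthest-point resampling: repeatedly insert the point of W
    farthest from the current landmark set L, as in the while loop above."""
    L = [seed]                                    # L := {p}
    d = np.linalg.norm(W - W[seed], axis=1)       # d(w, L) for every w in W
    for _ in range(k - 1):
        q = int(np.argmax(d))                     # q := argmax_w d(w, L)
        L.append(q)                               # L := L ∪ {q}
        d = np.minimum(d, np.linalg.norm(W - W[q], axis=1))
    return L, float(d.max())                      # landmarks and final ε

# usage: on 101 evenly spaced points of [0, 1], the first three landmarks
# are the left end, the right end, and the midpoint
W = np.linspace(0.0, 1.0, 101)[:, None]
L, eps = maxmin_landmarks(W, 3)                   # L == [0, 100, 50]
```

Each pass over W costs O(|W|) distance updates, so extracting k landmarks costs O(k|W|) distance computations in total.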

SLIDE 70-73

The simplicial complex to maintain

→ maintain the witness complex CW(L) [de Silva 2003]:
Let L ⊆ Rd (landmarks) s.t. |L| < +∞, and W ⊆ Rd (witnesses).
(Here, Rd may be replaced by any metric space X.)

Def: w ∈ W strongly witnesses [v0, · · · , vk] if ‖w − vi‖ = ‖w − vj‖ ≤ ‖w − u‖ for all i, j = 0, · · · , k and all u ∈ L \ {v0, · · · , vk} (Delaunay test).

Def: w ∈ W weakly witnesses [v0, · · · , vk] if ‖w − vi‖ ≤ ‖w − u‖ for all i = 0, · · · , k and all u ∈ L \ {v0, · · · , vk}.

Def: CW(L) is the largest abstract simplicial complex built over L whose faces are weakly witnessed by points of W.
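For intuition (not on the slides), under the weak-witness definition the 1-skeleton of CW(L) reduces to a nearest-two-landmarks computation per witness; a minimal numpy sketch, ignoring distance ties:

```python
import numpy as np

def witness_edges(W, L):
    """Edges of the witness complex C_W(L): [a, b] is weakly witnessed
    iff some w in W has a and b as its two nearest landmarks
    (ties between landmarks are broken arbitrarily here)."""
    # |W| x |L| matrix of witness-to-landmark distances
    D = np.linalg.norm(W[:, None, :] - L[None, :, :], axis=2)
    two_nearest = np.argsort(D, axis=1)[:, :2]    # per witness
    return {tuple(sorted(map(int, pair))) for pair in two_nearest}

# usage: landmarks at the corners of a square, witnesses at the edge
# midpoints -> the witnessed edges are exactly the four sides
L = np.array([[0., 0.], [1., 0.], [1., 1.], [0., 1.]])
W = np.array([[.5, 0.], [1., .5], [.5, 1.], [0., .5]])
edges = witness_edges(W, L)
```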

SLIDE 74-76

The witness complex (properties)

• Thm. 1 [de Silva 2003]: ∀W, L, ∀σ ∈ CW(L), ∃c ∈ Rd that strongly witnesses σ.
⇒ CW(L) is a subcomplex of Del(L) ⇒ CW(L) is embedded in Rd
(every point of W witnesses exactly one simplex of each dimension)

• Thm. 2 [de Silva, Carlsson 2004]: the size of CW(L) is O(d|W|); the time to compute CW(L) is O(d|W||L|)

• Thm. 3 [Guibas, Oudot 2007], [Attali, Edelsbrunner, Mileyko 2007]: under some conditions, CW(L) = DelS(L) ≈ S

SLIDE 77-78

The witness complex (properties)

→ connection with reconstruction:
• W ⊂ Rd is given as input
• L ⊆ W is generated
• underlying manifold S unknown
• only distance comparisons
⇒ the algorithm is applicable in any metric space

• In Rd, CW(L) can be maintained by updating, for each witness w, the list of the d + 1 nearest landmarks of w.
⇒ space ≤ O(d|W|), time ≤ O(d|W|²)
(argument: CW(L) ⊆ Del(L), whose simplices have dimension at most d)

SLIDE 79-85

The full algorithm

Input: a finite point set W ⊂ Rd.
Init: L := {p}; construct the lists of nearest landmarks; CW(L) = {[p]};
Invariant: ∀w ∈ W, the list of the d + 1 nearest landmarks of w is maintained throughout the process.

while L ≠ W
    insert argmax_{w ∈ W} d(w, L) in L;
    update the lists of nearest landmarks;
    update CW(L);
end while

Output: the sequence of complexes CW(L)

SLIDE 86-87

Theoretical guarantees

→ case of curves:

Conjecture [Carlsson, de Silva 2004]: CW(L) coincides with DelS(L)... under some conditions on W and L

SLIDE 88

Theoretical guarantees

→ case of curves:

• Thm. 3: If S is a closed curve with positive reach, W ⊂ Rd s.t. dH(W, S) ≤ δ, and L ⊆ W is an ε-sparse ε-sample of W with δ ≪ ε ≪ ̺S, then CW(L) = DelS(L) ≈ S.

slide-89
SLIDE 89

talk about stabilization of topological invariants, e.g. Betti numbers (number of CCs and holes here)

  • Thm. 3

If S is a closed curve with positive reach, W ⊂ Rd s.t. dH(W, S) ≤ δ, L ⊆ W ε-sparse ε-sample of W with δ < < ε < < ̺S, then CW (L) = DelS(L) ≈ S.

> ε

1/ε 1/̺S 1/δ 1/εr 1/εl

εl β1 β0

1 2

εr

→ There is a plateau in the diagram of Betti numbers of CW (L).

Theoretical guarantees

→ case of curves:

SLIDE 90-93

Theoretical guarantees

→ case of curves:

• Thm. 3 (recalled): If S is a closed curve with positive reach, W ⊂ Rd s.t. dH(W, S) ≤ δ, and L ⊆ W is an ε-sparse ε-sample of W with δ ≪ ε ≪ ̺S, then CW(L) = DelS(L) ≈ S.

→ proof of the two inclusions (figures): DelS(L) ⊆ CW(L), then CW(L) ⊆ DelS(L)

SLIDE 94-97

Theoretical guarantees

→ case of surfaces:

Thm [Attali, Edelsbrunner, Mileyko]: If ε ≪ ̺S, then ∀W ⊆ S, CW(L) ⊆ DelS(L).
⇒ CS(L) = DelS(L)
(figure: ε = 0.2, ̺S ≈ 0.25; the argument uses the order-2 Voronoi diagram)

Pb: DelS(L) ⊆ CW(L) may fail when W ⊊ S.

SLIDE 98

Theoretical guarantees

→ case of surfaces:

Thm [Attali, Edelsbrunner, Mileyko]: If ε ≪ ̺S, then ∀W ⊆ S, CW(L) ⊆ DelS(L). ⇒ CS(L) = DelS(L)
Pb: DelS(L) ⊆ CW(L) may fail when W ⊊ S.

Solution: relax the witness test [Guibas, Oudot]
⇒ CWν(L) = DelS(L) + slivers
⇒ CWν(L) ⊄ Del(L) ⇒ CWν(L) not embedded.

Post-process: extract a manifold M from CWν(L) ∩ Del(L) [Amenta, Choi, Dey, Leekha]

SLIDE 99-103

Some results

(figures: reconstructions of the same data at three scales, labeled 1, 2, 3)

SLIDE 104

Some results (cont'd)

(input model provided courtesy of IMATI by the Aim@Shape repository)

SLIDE 114-116

Some results (cont'd)

(input data set courtesy of the Graphics Lab@Stanford)

SLIDE 118

Higher dimensions

→ Carlsson and de Silva's conjecture: under some sampling conditions, CW(L) = DelS(L) ≈ S

SLIDE 119

Higher dimensions

→ Carlsson and de Silva's conjecture: no longer true
• DelS(L) may not be included in CW(L) on d-manifolds, d ≥ 2 [Guibas, Oudot]

SLIDE 120-121

Higher dimensions

→ Carlsson and de Silva's conjecture (CW(L) = DelS(L) ≈ S): no longer true

→ sources of problems:
• DelS(L) may not be included in CW(L) on d-manifolds, d ≥ 2 [Guibas, Oudot]
• CW(L) may not be included in DelS(L) on d-manifolds, d ≥ 3 [Oudot]
• slivers: DelS(L) may not be homeomorphic to S, nor even homotopy equivalent [Oudot]

→ remedies [Boissonnat, Guibas, Oudot]:
• dilate W so that it includes S
• assign weights to the landmarks to remove slivers [Cheng, Dey, Ramos]

SLIDE 122

(Speaker note: there are not only theoretical bottlenecks, such as the ones mentioned here, but also some crucial algorithmic issues.)

Higher-dimensional reconstruction is still widely open