Simultaneous Nearest Neighbor Search — Piotr Indyk, Robert Kleinberg (PowerPoint presentation)

SLIDE 1

Simultaneous Nearest Neighbor Search

Piotr Indyk (MIT), Robert Kleinberg (Cornell), Sepideh Mahabadi (MIT), Yang Yuan (Cornell)

6/17/2016


SLIDE 6

Approximate Nearest Neighbor

  • Dataset of n points P in a metric space (X, d_X)
  • A query point q comes online
  • Goal:
  • Find the nearest dataset point p*
  • Do it in sub-linear time
  • Approximate Nearest Neighbor: report any point p with d_X(q, p) ≤ c · d_X(q, p*)

SLIDE 8

What if…

  • We have multiple queries, and we need the results of the queries to be related.
  • Example:
  • Noisy image
  • For each pixel, find the true color
  • Neighboring pixels should have similar colors
SLIDE 9

Simultaneous NN Problem

SLIDE 15

The SNN problem

(Felzenszwalb’15)

  • Dataset of n points P in a metric space (X, d_X)
  • Query comes online and contains
  • k query points Q = (q_1, …, q_k)
  • A compatibility graph G = (Q, E_G)
  • Goal is to report (p_1, …, p_k), p_j ∈ P, that minimizes

    Σ_{j=1}^{k} d_X(q_j, p_j) + Σ_{(q_j, q_{j'}) ∈ E_G} d_X(p_j, p_{j'})

SLIDE 16

The Generalized SNN

  • Dataset of n points P in a metric space (X, d_X)
  • Query comes online and contains
  • k query points Q = (q_1, …, q_k)
  • A compatibility graph G = (Q, E_G)
  • Goal is to report (p_1, …, p_k), p_j ∈ P, that minimizes

    Σ_{j=1}^{k} λ_j d_X(q_j, p_j) + Σ_{(q_j, q_{j'}) ∈ E_G} μ_{j,j'} d_X(p_j, p_{j'})

SLIDE 17

Independent NN Algorithm


SLIDE 24

Independent NN Algorithm

INN Algorithm

  • For each query point q_j ∈ Q
  • Independently find an (approximate) nearest neighbor p_j (Searching step)
  • Replace the label set P with the reduced set P' = {p_1, …, p_k} (Pruning step)
  • Solve the problem for P'
  • Reduces the size of the label set from n down to k
  • The optimal value increases by at most a factor β, the pruning gap
  • Any γ-approximate metric labeling algorithm can be used on the reduced set, giving a (β · γ)-approximate algorithm.

SLIDE 25

Results


SLIDE 33

Results

  • Prove bounds for the pruning gap
  • β = O(log k / log log k)
  • β = Ω(√log k)
  • For s-sparse graphs: β = O(s)
  • Graphs with pseudo-arboricity s: each edge can be mapped to a vertex such that at most s edges are mapped to any vertex
  • This means a constant approximation factor for trees, grids, planar graphs, …, and in particular an O(s)-approximation for degree-s graphs
  • β is very close to one in experiments
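The pseudo-arboricity definition above can be checked mechanically; a minimal sketch, where the edge-to-endpoint mapping is supplied by hand rather than computed:

```python
def witnesses_pseudo_arboricity(edges, assign, s):
    """assign maps each edge to one of its endpoints; check every vertex
    receives at most s edges, witnessing pseudo-arboricity <= s."""
    load = {}
    for e in edges:
        v = assign[e]
        assert v in e, "edge must map to one of its own endpoints"
        load[v] = load.get(v, 0) + 1
    return all(c <= s for c in load.values())

# A 4-cycle has pseudo-arboricity 1: map each edge to its first endpoint,
# giving every vertex exactly one edge.
cycle = [(0, 1), (1, 2), (2, 3), (3, 0)]
assign = {e: e[0] for e in cycle}
print(witnesses_pseudo_arboricity(cycle, assign, 1))  # True
```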

SLIDE 34

Overview of the proof for

β = O(log k / log log k)

SLIDE 39

0-Extension Problem [Kar98]

  • The input:
  • a graph H = (V, E)
  • a weight function w(e)
  • a set of terminals T ⊂ V
  • The goal: find a mapping f: V → T s.t.
  • Each terminal is mapped to itself
  • Minimize Σ_{(u,v) ∈ E} w(u, v) · d(f(u), f(v))

[Figure: a small weighted graph with edge weights 2, 1, 2, 1, 2; the shown mapping has Cost = 1 · d(t_1, t_2)]
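The definition above can be solved by brute force on tiny instances (exponential in the number of non-terminals, so a sketch only; the weights and metric here are chosen for illustration):

```python
from itertools import product

def zero_extension(V, E, w, T, d):
    """Pick f(v) in T for every non-terminal v, fix f(t) = t, and
    minimize sum over edges of w(u, v) * d(f(u), f(v))."""
    free = [v for v in V if v not in T]
    best_f, best_cost = None, float("inf")
    for choice in product(T, repeat=len(free)):
        f = {t: t for t in T}
        f.update(zip(free, choice))
        cost = sum(w[(u, v)] * d(f[u], f[v]) for u, v in E)
        if cost < best_cost:
            best_f, best_cost = f, cost
    return best_f, best_cost

# Two terminals joined through one free vertex:
d = lambda a, b: 0 if a == b else 1
V, T = ["t1", "v", "t2"], ["t1", "t2"]
E = [("t1", "v"), ("v", "t2")]
w = {("t1", "v"): 2, ("v", "t2"): 1}
f, c = zero_extension(V, E, w, T, d)
print(f["v"], c)  # mapping v to t1 pays only 1 * d(t1, t2) = 1
```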

SLIDE 44

0-Extension Problem

Prior work: [CKR05, FHRT03, AFHKTT04, LN04]

  • Upper bounds:
  • E.g., an O(log |T| / log log |T|)-approximation algorithm [CKR05]
  • Consider the metric relaxation of the LP for the problem
  • Solve the LP
  • Round the solution
  • Ω(√log |T|) integrality gap [FHRT03]
  • Efficient if the number of terminals is low

SLIDE 52

Connection to SNN

  • SNN: (P, Q, G)
  • 0-Extension: (V, H, w, T)

1. 0-extension can be solved using generalized SNN

  • Q = V, P = T, μ_{j,k} = w(j, k), λ_j = ∞ if q_j ∈ T and 0 o.w.

2. SNN can be solved using 0-extension in a black-box manner

  • Set: T = P, V = Q ∪ P, w ≡ 1, H = G ∪ {(q_j, p_j)}_j
  • giving an O(log n / log log n)-approximation algorithm

3. Improve the dependence to k instead of n

  • Analyzing INN using 0-extension in a “grey-box” manner
  • Using subtle properties of existing algorithms for 0-extension
  • Leads to an O(log k / log log k) approximation
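Reduction 2 above can be sketched as instance construction. The slide leaves the per-query edges {(q_j, p_j)} implicit; in this sketch each query is hung on its exact nearest neighbor, which is an assumption, not necessarily the paper's exact construction:

```python
def snn_to_zero_extension(P, Q, E_G, d):
    """Build a 0-extension instance (V, E, w, T) from an SNN instance."""
    nn = [min(P, key=lambda p: d(q, p)) for q in Q]  # assumed NN edges
    V = list(P) + list(Q)       # vertices: dataset points plus queries
    T = list(P)                 # terminals: the dataset points
    E = ([(Q[j], Q[k]) for j, k in E_G]              # compatibility edges
         + [(Q[j], nn[j]) for j in range(len(Q))])   # query-to-NN edges
    w = {e: 1 for e in E}       # unit weights
    return V, E, w, T

d = lambda a, b: abs(a - b)
V, E, w, T = snn_to_zero_extension([0, 10], [1, 9], [(0, 1)], d)
print(sorted(E))  # [(1, 0), (1, 9), (9, 10)]
```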

SLIDE 53

Experiments


SLIDE 56

Experimental Results

  • De-noising problem
  • Each pixel is a query point
  • Dataset P: all 256³ possible colors
  • Graph: the grid
  • Algorithm
  • Only consider the colors that appear in the noisy image
  • Result: the empirical pruning gap β is very close to 1, at most 1.024
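The experiment's pruning idea can be illustrated on a toy 1-D "image" (an 8-color palette and a 3-pixel path instead of 256³ colors and a 2-D grid, solved by brute force, so far simpler than the paper's actual pipeline): restrict the labels to the observed colors and measure the empirical pruning gap.

```python
from itertools import product

def denoise_cost(noisy, edges, y):
    data = sum(abs(a - b) for a, b in zip(noisy, y))      # fidelity term
    smooth = sum(abs(y[i] - y[j]) for i, j in edges)       # smoothness term
    return data + smooth

def best_cost(noisy, edges, labels):
    return min(denoise_cost(noisy, edges, y)
               for y in product(labels, repeat=len(noisy)))

noisy = [0, 7, 0]
edges = [(0, 1), (1, 2)]
all_labels = range(8)            # toy palette instead of 256**3 colors
observed = sorted(set(noisy))    # the experiment's pruned label set
beta = best_cost(noisy, edges, observed) / best_cost(noisy, edges, all_labels)
print(beta)  # 1.0 on this toy input; at most 1.024 in the paper's experiments
```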

SLIDE 57

Experimental Results

[Figure: original image, noisy image, de-noised using all colors, de-noised using only the noisy image's colors]

SLIDE 58

Experimental Results

[Figure: original image, half-noisy image, de-noised image]

SLIDE 69

Conclusion

  • Summary of Results
  • Presented Independent NN (INN) pruning
  • Induces an extra factor β, the pruning gap
  • β = O(log k / log log k), β = Ω(√log k)
  • β = O(1) for the sparse graphs that are mostly used in applications
  • β ≈ 1 in the de-noising experiments
  • Open Problems
  • Prove tighter bounds for β
  • Get better guarantees using a different algorithm, e.g., instead of picking the closest point, pick a few points.
  • Solve the general case of the problem, i.e., where
  • the metrics d_X(q_j, p_j) and d_Y(p_j, p_{j'}) are different
  • there are weights

Thank You!