SLIDE 1

Submodular Observation Selection and Information Gathering for Quadratic Models

Abolfazl Hashemi∗, Mahsa Ghasemi, Haris Vikalo, and Ufuk Topcu

ICML, Wednesday June 12, 2019

SLIDE 2

Observation Selection and Information Gathering

  • Resource-constrained inference problems
  • Target tracking, experimental design, sensor networks
  • Access to expensive / limited observations
  • Communication cost, power consumption, computational burden

Goal: Cost-effectively identify the most useful subset of information


Hashemi et al.: Submodular Observation Selection for Quadratic Models

SLIDE 3

Observation Selection for Quadratic Models

  • Quadratic relation between observations and unknown parameters

  y_i = g_i(x) + v_i,  where  g_i(x) := (1/2) x^⊤ Z_i x + h_i^⊤ x,   i ∈ {1, 2, …, n}

(a) Phase retrieval: y_i = (1/2) x^* (z_i z_i^*) x + v_i
(b) Localization: y_i = (1/2) ‖h_i − x‖_2^2 + v_i

(Figures from [Candes’15] and [Gezici’05])

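As a concrete illustration of the quadratic measurement model above, the real-valued case can be simulated as follows. This is a minimal sketch with hypothetical dimensions and noise levels, not code from the paper; `sym` and `g` are illustrative names.

```python
import numpy as np

rng = np.random.default_rng(0)
m, n = 5, 12                    # parameter dimension, number of observations (illustrative)
x = rng.standard_normal(m)      # unknown parameter vector

def sym(A):
    """Symmetrize A; the quadratic term Z_i can be taken symmetric w.l.o.g."""
    return (A + A.T) / 2

# Quadratic terms Z_i, linear terms h_i, noise standard deviations sigma_i
Z = [sym(rng.standard_normal((m, m))) for _ in range(n)]
h = [rng.standard_normal(m) for _ in range(n)]
sigma = np.full(n, 0.1)

def g(i, x):
    """Noiseless observation g_i(x) = (1/2) x^T Z_i x + h_i^T x."""
    return 0.5 * x @ Z[i] @ x + h[i] @ x

# Observations y_i = g_i(x) + v_i with Gaussian noise v_i
y = np.array([g(i, x) + sigma[i] * rng.standard_normal() for i in range(n)])
```

Phase retrieval and localization are the complex-valued and norm-based special cases of this template.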

SLIDE 4

Locally-Optimal Observation Selection

  • Challenge: Unknown optimal estimator and error covariance matrix
  • Locally-optimal observation selection [Flaherty’06, Krause’08]:

Linearize around a guess x_0:  ŷ_i := y_i − g_i(x_0) ≈ ∇g_i(x_0)^⊤ x + v_i,
and find an approximate covariance matrix:

  P̂_S = ( Σ_x^{-1} + Σ_{i∈S} (1/σ_i^2) ∇g_i(x_0) ∇g_i(x_0)^⊤ )^{-1}

  • Observation selection task:

  minimize_S  Tr( P̂_S )   s.t.   S ⊂ [n], |S| = K

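This locally-optimal baseline can be sketched as a greedy loop that, at each step, adds the observation whose linearized information gain most reduces the trace of the approximate covariance. The code below uses random stand-ins for the gradients ∇g_i(x_0); `greedy_trace_selection` is an illustrative name, not from the paper.

```python
import numpy as np

def greedy_trace_selection(grads, sigma2, Sigma_x_inv, K):
    """Greedily pick K observations to minimize Tr(P_hat_S), where
    P_hat_S = (Sigma_x^{-1} + sum_{i in S} (1/sigma_i^2) g_i g_i^T)^{-1}
    and g_i = grad g_i(x0) at the linearization point x0."""
    n = len(grads)
    S = []
    M = Sigma_x_inv.copy()          # running information (inverse covariance) matrix
    for _ in range(K):
        best_i, best_tr = None, np.inf
        for i in range(n):
            if i in S:
                continue
            # Trace of the approximate covariance if observation i is added
            Mi = M + np.outer(grads[i], grads[i]) / sigma2[i]
            tr = np.trace(np.linalg.inv(Mi))
            if tr < best_tr:
                best_i, best_tr = i, tr
        S.append(best_i)
        M += np.outer(grads[best_i], grads[best_i]) / sigma2[best_i]
    return S

# Hypothetical small instance: random vectors stand in for the gradients
rng = np.random.default_rng(1)
m, n, K = 3, 8, 4
grads = [rng.standard_normal(m) for _ in range(n)]
S = greedy_trace_selection(grads, np.full(n, 0.25), np.eye(m), K)
```

Note the drawback motivating the next slides: the quality of P̂_S, and hence of the selected set, depends entirely on the linearization point x_0.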

SLIDES 5–7

Proposed Approach: VTB for Quadratic Models

Main Idea: Exploit the Van Trees bound (VTB) on the error covariance of weakly biased estimators.

  • A closed-form expression for the VTB of quadratic models

Theorem 1: For any weakly biased estimator x̂_S with error covariance P_S, it holds that

  P_S ⪰ ( Σ_{i∈S} (1/σ_i^2) ( Z_i Σ_x Z_i^⊤ + h_i h_i^⊤ ) + I_x )^{-1} =: B_S

  • Proposed method: Find S by greedily maximizing the Tr(·) scalarization of B_S:

  f_A(S) := Tr( I_x^{-1} − B_S )

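The proposed VTB-based scheme can be sketched as follows: build B_S from Theorem 1 (no linearization point required, since the bound depends only on the model terms Z_i, h_i and the prior), then greedily maximize the scalarization f_A. Data and function names are illustrative, not from the paper.

```python
import numpy as np

def B_S(S, Z, h, sigma2, Sigma_x, I_x):
    """VTB matrix from Theorem 1: inverse of the accumulated information."""
    M = I_x.copy()
    for i in S:
        M += (Z[i] @ Sigma_x @ Z[i].T + np.outer(h[i], h[i])) / sigma2[i]
    return np.linalg.inv(M)

def f_A(S, Z, h, sigma2, Sigma_x, I_x):
    """Scalarized objective f_A(S) = Tr(I_x^{-1} - B_S); f_A(empty set) = 0."""
    return np.trace(np.linalg.inv(I_x) - B_S(S, Z, h, sigma2, Sigma_x, I_x))

def greedy_select(Z, h, sigma2, Sigma_x, I_x, K):
    """Greedily add the observation with the largest marginal gain in f_A."""
    S = []
    for _ in range(K):
        j = max((i for i in range(len(Z)) if i not in S),
                key=lambda i: f_A(S + [i], Z, h, sigma2, Sigma_x, I_x))
        S.append(j)
    return S

# Hypothetical instance with a standard-normal prior (Sigma_x = I_x = I)
rng = np.random.default_rng(0)
m, n, K = 3, 10, 4
Z = [(lambda A: (A + A.T) / 2)(rng.standard_normal((m, m))) for _ in range(n)]
h = [rng.standard_normal(m) for _ in range(n)]
S = greedy_select(Z, h, np.full(n, 0.5), np.eye(m), np.eye(m), K)
```

Since each selected observation adds a positive semidefinite term to the accumulated information matrix, B_S shrinks and f_A grows monotonically as S expands, which is what the next slides quantify.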

SLIDES 8–11

Characterizing f_A(S)

  • Submodularity: f_j(S) ≥ f_j(T) for all S ⊆ T ⊂ X and j ∈ X\T, where f_j(S) := f(S ∪ {j}) − f(S) denotes the marginal gain
  • α_f-Weak Submodularity [Zhang’16, Chamon’17]: α_f · f_j(S) ≥ f_j(T), with α_f > 1, for all S ⊆ T ⊂ X and j ∈ X\T
  • Greedy maximization performance: f(S) ≥ (1 − e^{−1/α_f}) f(O)

Theorem 2: f_A(S) is a normalized, monotone set function with bounded α_{f_A}.

  • Interpretation of the bound on α_{f_A} as an SNR condition

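The greedy guarantee above can be checked numerically on a toy instance. The sketch below (hypothetical sizes, random data) enumerates all nested pairs S ⊆ T ⊂ X to compute the weak-submodularity constant of f_A exactly, then compares the greedy set against the brute-force optimum of size K.

```python
import itertools
import numpy as np

# Tiny hypothetical instance of the quadratic model (Sigma_x = I_x = I)
rng = np.random.default_rng(2)
m, n, K = 2, 5, 2
Z = [(lambda A: (A + A.T) / 2)(rng.standard_normal((m, m))) for _ in range(n)]
h = [rng.standard_normal(m) for _ in range(n)]
sigma2 = np.full(n, 1.0)
Sigma_x, I_x = np.eye(m), np.eye(m)

def f_A(S):
    """f_A(S) = Tr(I_x^{-1} - B_S), with B_S as in Theorem 1."""
    M = I_x.copy()
    for i in S:
        M += (Z[i] @ Sigma_x @ Z[i].T + np.outer(h[i], h[i])) / sigma2[i]
    return np.trace(np.linalg.inv(I_x) - np.linalg.inv(M))

def gain(j, S):
    """Marginal gain f_j(S) = f_A(S + {j}) - f_A(S)."""
    return f_A(list(S) + [j]) - f_A(list(S))

# Weak-submodularity constant: max over S subset of T (T proper) and j outside T
alpha = 1.0
for rT in range(n):
    for T in itertools.combinations(range(n), rT):
        for rS in range(len(T) + 1):
            for S in itertools.combinations(T, rS):
                for j in set(range(n)) - set(T):
                    gS, gT = gain(j, S), gain(j, T)
                    if gS > 1e-12:
                        alpha = max(alpha, gT / gS)

# Greedy set of size K versus the brute-force optimum
S_greedy = []
for _ in range(K):
    S_greedy.append(max((j for j in range(n) if j not in S_greedy),
                        key=lambda j: gain(j, S_greedy)))
f_opt = max(f_A(list(S)) for S in itertools.combinations(range(n), K))

# Guarantee from the slides: f(S_greedy) >= (1 - e^{-1/alpha_f}) f(O)
assert f_A(S_greedy) >= (1 - np.exp(-1 / alpha)) * f_opt
```

The exhaustive enumeration is only feasible because n is tiny; its purpose here is to make the constant α_f, normally only bounded analytically, directly observable.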

SLIDE 12

Evaluation of Theoretical Results

  • Phase retrieval problem with n = 12 observations

(c) Tightness of VTB    (d) Bound on α_{f_A}    [plots omitted]

  • Asymptotic tightness of VTB
  • Tightness of weak submodularity bound in low SNR regime



SLIDE 13

Thank you!

Submodular Observation Selection and Information Gathering for Quadratic Models
Poster #167, Wed Jun 12th 06:30 PM – 09:00 PM @ Pacific Ballroom

Correspondence:
Abolfazl Hashemi (email: abolfazl@utexas.edu)
Mahsa Ghasemi (email: mahsa.ghasemi@utexas.edu)