  1. Submodular Observation Selection and Information Gathering for Quadratic Models
     Abolfazl Hashemi*, Mahsa Ghasemi, Haris Vikalo, and Ufuk Topcu
     ICML, Wednesday June 12, 2019

  2. Observation Selection and Information Gathering
     • Resource-constrained inference problems
       ◦ Target tracking, experimental design, sensor networks
     • Access to expensive / limited observations
       ◦ Communication cost, power consumption, computational burden
     Goal: Cost-effectively identify the most useful subset of information

  3. Observation Selection for Quadratic Models
     • Quadratic relation between observations and unknown parameters:
       $y_i = \underbrace{\tfrac{1}{2} x^\top Z_i x + h_i^\top x}_{g_i(x)} + v_i, \qquad i \in \{1, 2, \ldots, n\}$
       (a) Phase retrieval: $y_i = \tfrac{1}{2} x^* (z_i z_i^*) x + v_i$
       (b) Localization: $y_i = \tfrac{1}{2} \| h_i - x \|_2^2 + v_i$
       (Figures from [Candes'15] and [Gezici'05])
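
     To make the measurement model concrete, the following is a minimal NumPy sketch of how such quadratic observations could be simulated. It is an illustration only (not code accompanying the slides); the dimensions, the random draws of $Z_i$ and $h_i$, and the single common noise level `sigma` are assumptions made here for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n, sigma = 5, 12, 0.1                 # parameter dimension, number of observations, noise std (illustrative)

x_true = rng.standard_normal(d)          # unknown parameter vector x

# Random quadratic model: y_i = 0.5 * x' Z_i x + h_i' x + v_i
Z, h = [], []
for _ in range(n):
    A = rng.standard_normal((d, d))
    Z.append((A + A.T) / 2)              # symmetric matrix Z_i of the quadratic term
    h.append(rng.standard_normal(d))     # linear term h_i

def g(i, x):
    """Noiseless part g_i(x) = 0.5 x' Z_i x + h_i' x."""
    return 0.5 * x @ Z[i] @ x + h[i] @ x

y = np.array([g(i, x_true) + sigma * rng.standard_normal() for i in range(n)])
```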

  4. Locally-Optimal Observation Selection
     • Challenge: Unknown optimal estimator and error covariance matrix
     • Locally-optimal observation selection [Flaherty'06, Krause'08]: Linearize around a guess $x_0$,
       $\hat{y}_i := y_i - g_i(x_0) \approx \nabla g_i(x_0)^\top x + v_i,$
       and find an approximate covariance matrix:
       $\hat{P}_S = \left( \Sigma_x^{-1} + \sum_{i \in S} \frac{1}{\sigma_i^2} \nabla g_i(x_0) \nabla g_i(x_0)^\top \right)^{-1}$
     • Observation selection task:
       $\min_{S} \ \mathrm{Tr}\left( \hat{P}_S \right) \quad \text{s.t.} \quad S \subset [n], \ |S| = K$
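
     The linearized covariance above is easy to compute for the quadratic model, since $\nabla g_i(x_0) = Z_i x_0 + h_i$. Below is a hedged sketch (reusing `Z`, `h`, and NumPy from the simulation snippet earlier, and assuming a given prior covariance `Sigma_x` and a single common noise variance `sigma2` in place of the per-observation $\sigma_i^2$); it is not the authors' implementation.

```python
def grad_g(i, x0):
    """Gradient of g_i at the linearization point: grad g_i(x0) = Z_i x0 + h_i."""
    return Z[i] @ x0 + h[i]

def approx_cov(S, x0, Sigma_x, sigma2):
    """Linearization-based approximate error covariance P_hat_S for a subset S."""
    info = np.linalg.inv(Sigma_x)             # prior information Sigma_x^{-1}
    for i in S:
        a = grad_g(i, x0)
        info += np.outer(a, a) / sigma2       # rank-one information added by observation i
    return np.linalg.inv(info)

# The locally-optimal selection task then seeks S with |S| = K minimizing
# np.trace(approx_cov(S, x0, Sigma_x, sigma2)).
```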

  5-7. Proposed Approach: VTB for Quadratic Models
     Main Idea: Exploiting the Van Trees bound (VTB) on the error covariance of weakly biased estimators
     • A closed-form expression for the VTB of quadratic models.
       Theorem 1: For any weakly biased estimator $\hat{x}_S$ with error covariance $P_S$ it holds that
       $P_S \succeq \left( \sum_{i \in S} \frac{1}{\sigma_i^2} \left( Z_i \Sigma_x Z_i^\top + h_i h_i^\top \right) + I_x \right)^{-1} = B_S$
     • Proposed method: Find $S$ by greedily maximizing the $\mathrm{Tr}(\cdot)$ scalarization of $B_S$:
       $f_A(S) := \mathrm{Tr}\left( I_x^{-1} - B_S \right)$
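
     A naive sketch of the proposed greedy selection, built on the closed-form bound of Theorem 1, could look as follows. This is only an illustration under assumptions made here (a single common noise variance `sigma2` instead of $\sigma_i^2$, given `Sigma_x` and prior Fisher information `I_x`, and a full matrix inversion at every candidate evaluation, whereas an efficient implementation would likely use incremental low-rank updates); it reuses `Z`, `h`, and NumPy from the earlier snippets.

```python
def vtb_bound(S, Sigma_x, I_x, sigma2):
    """Closed-form Van Trees bound B_S for the quadratic model (Theorem 1)."""
    M = I_x.copy()
    for i in S:
        M += (Z[i] @ Sigma_x @ Z[i].T + np.outer(h[i], h[i])) / sigma2
    return np.linalg.inv(M)

def f_A(S, Sigma_x, I_x, sigma2):
    """Trace scalarization f_A(S) = Tr(I_x^{-1} - B_S); equals 0 for the empty set."""
    return np.trace(np.linalg.inv(I_x) - vtb_bound(S, Sigma_x, I_x, sigma2))

def greedy_select(n, K, Sigma_x, I_x, sigma2):
    """Greedily add the observation with the largest marginal gain in f_A."""
    S = []
    for _ in range(K):
        best = max((j for j in range(n) if j not in S),
                   key=lambda j: f_A(S + [j], Sigma_x, I_x, sigma2))
        S.append(best)
    return S
```

     Because $f_A$ is monotone and weakly submodular (Theorem 2 on the next slide), this greedy loop inherits the $(1 - e^{-1/\alpha_{f_A}})$ approximation guarantee discussed next.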

  8-11. Characterizing $f_A(S)$
     • Submodularity: $f_j(S) \ge f_j(T)$ for all $S \subseteq T \subset \mathcal{X}$ and $j \in \mathcal{X} \setminus T$, where $f_j(S)$ denotes the marginal gain of adding element $j$ to $S$
     • $\alpha_f$-weak submodularity [Zhang'16, Chamon'17]: $\alpha_f \, f_j(S) \ge f_j(T)$, with $\alpha_f > 1$, for all $S \subseteq T \subset \mathcal{X}$ and $j \in \mathcal{X} \setminus T$
     • Greedy maximization performance (a worked instance follows below): $f(S) \ge \left( 1 - e^{-1/\alpha_f} \right) f(O)$, where $O$ is an optimal subset
     Theorem 2: $f_A(S)$ is a normalized, monotone set function with bounded $\alpha_{f_A}$.
     • Interpretation of the bound on $\alpha_{f_A}$ as an SNR condition
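
     As a quick arithmetic check of the guarantee above (worked out here, not taken from the slides): in the fully submodular limit $\alpha_f \to 1$ the bound recovers the classical $1 - 1/e$ factor, and it degrades gracefully as $\alpha_f$ grows.

```latex
f(S) \ge \left(1 - e^{-1/\alpha_f}\right) f(O)
\quad\Longrightarrow\quad
\begin{cases}
\alpha_f = 1: & f(S) \ge \left(1 - e^{-1}\right) f(O) \approx 0.632\, f(O),\\
\alpha_f = 2: & f(S) \ge \left(1 - e^{-1/2}\right) f(O) \approx 0.393\, f(O).
\end{cases}
```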

  12. Evaluation of Theoretical Results
      • Phase retrieval problem with n = 12 observations
      [Plots: (c) Tightness of VTB; (d) Bound on $\alpha_{f_A}$]
      • Asymptotic tightness of the VTB
      • Tightness of the weak submodularity bound in the low-SNR regime

  13. Thank you!
      Submodular Observation Selection and Information Gathering for Quadratic Models
      Poster #167, Wed Jun 12th, 06:30 PM – 09:00 PM @ Pacific Ballroom
      Correspondence:
      Abolfazl Hashemi (email: abolfazl@utexas.edu)
      Mahsa Ghasemi (email: mahsa.ghasemi@utexas.edu)
