Bounds on Sparse Recovery with Additional Structures
  1. Bounds on Sparse Recovery with Additional Structures
  Abbas Kazemipour, University of Maryland, College Park
  kaazemi@umd.edu
  March 23, 2015
  Abbas Kazemipour (UMD) — Sparse Recovery with Side Info. — March 23, 2015 — 1 / 19

  2. Overview
  1 Restricted Isometry Property: introduction (RIP revisited); proof of the RIP.
  2 RIP with Side Information: motivation; formulation — sequences of signals with sparse increments.

  3. Motivation
  1 How many samples are necessary?
  2 We will discuss sufficiency today.
  3 Information-theoretic arguments are needed for the converse.
  4 What if we have more structure on the sparsity?
  5 Example: sequences of signals with sparse increments — total variation, differential ℓ1-minimization; application to ENF.


  8. Restricted Isometry Property (RIP)
  1 Measurement model: y = Ax.   (1)
  RIP-s: A is said to satisfy the RIP of order s with constant δ_s if
      (1 − δ_s)‖x‖₂² ≤ ‖Ax‖₂² ≤ (1 + δ_s)‖x‖₂²   (2)
  for every s-sparse x.
  2 Without loss of generality we may assume ‖x‖₂² = 1.

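For small matrices, the definition above can be checked directly. The sketch below (an illustration, not from the slides) computes δ_s exactly by enumerating every support of size s, using the equivalent characterization δ_s = max over |S| = s of ‖A_Sᵀ A_S − I‖₂→₂, the same quantity that appears later in the proof; the dimensions and seed are arbitrary choices.

```python
# Exhaustively compute the RIP constant delta_s of a small random matrix.
import itertools
import numpy as np

def rip_constant(A, s):
    """Exact delta_s of A, by enumerating all supports of size s.
    Uses delta_s = max_{|S|=s} ||A_S^T A_S - I||_{2->2}."""
    m, N = A.shape
    delta = 0.0
    for S in itertools.combinations(range(N), s):
        A_S = A[:, list(S)]
        G = A_S.T @ A_S - np.eye(s)
        # G is symmetric, so its operator norm is the largest |eigenvalue|.
        delta = max(delta, np.max(np.abs(np.linalg.eigvalsh(G))))
    return delta

rng = np.random.default_rng(0)
m, N, s = 60, 12, 2
A = rng.standard_normal((m, N)) / np.sqrt(m)  # columns have unit expected norm
delta_2 = rip_constant(A, s)
```

With m comfortably larger than s log(N/s), the computed δ₂ comes out well below 1, as the main theorem predicts.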

  10. Sub-Gaussian Random Matrix
  1 The entries of A are independent, mean-zero, sub-Gaussian random variables, i.e. for all t > 0,
      P(|A_{j,k}| ≥ t) ≤ β e^{−κt²}.   (3)
  2 Examples: Bernoulli (±1) random variables, Gaussian random variables, etc.

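As a quick sanity check on the two examples (an illustration, not from the slides): the standard Gaussian and the symmetric Bernoulli (Rademacher) variable both satisfy the tail bound (3) with, e.g., β = 2 and κ = 1/2 — these particular constants are a choice made here, not values given on the slides.

```python
import math

def subgaussian_bound(t, beta=2.0, kappa=0.5):
    """Right-hand side of P(|X| >= t) <= beta * exp(-kappa * t^2)."""
    return beta * math.exp(-kappa * t * t)

def gaussian_tail(t):
    """Exact P(|X| >= t) for X ~ N(0, 1)."""
    return math.erfc(t / math.sqrt(2.0))

def rademacher_tail(t):
    """Exact P(|X| >= t) for X uniform on {-1, +1}."""
    return 1.0 if t <= 1.0 else 0.0

ts = [0.5, 1.0, 2.0, 3.0]
gaussian_ok = all(gaussian_tail(t) <= subgaussian_bound(t) for t in ts)
rademacher_ok = all(rademacher_tail(t) <= subgaussian_bound(t) for t in ts)
```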

  12. Main Theorem
  1 Let the entries of A ∈ R^{m×N} have normalized variance. Then:
  Sufficient Number of Measurements for Sparse Recovery: There exists C > 0 such that (1/√m)A satisfies RIP-s with δ_s ≤ δ with probability at least 1 − ε, provided
      m ≥ (C/δ²) ( s log(eN/s) + log(2/ε) ).   (4)
  2 Setting ε = 2 exp(−δ²m/(2C)) gives
      m ≥ (2C/δ²) s log(eN/s).   (5)
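Bound (4) is easy to evaluate numerically; a sketch follows, with the caveat that C is an unspecified absolute constant from the theorem, and C = 1 below is only a placeholder.

```python
import math

def sufficient_m(N, s, delta, eps, C=1.0):
    """Smallest integer m with m >= (C/delta^2)(s*log(e*N/s) + log(2/eps)).
    C = 1 is a placeholder; the theorem only asserts that some C > 0 exists."""
    rhs = C / delta**2 * (s * math.log(math.e * N / s) + math.log(2.0 / eps))
    return math.ceil(rhs)

m_needed = sufficient_m(N=10_000, s=20, delta=0.3, eps=1e-3)
```

Even with the placeholder constant, the bound illustrates the point of the theorem: the number of measurements depends on N only logarithmically, so doubling N barely increases m_needed.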

  13. Proof of Main Theorem
  1 Step 1: concentration inequality for sub-Gaussian random matrices. For all x and t ∈ (0, 1),
      P( | (1/m)‖Ax‖₂² − ‖x‖₂² | ≥ t‖x‖₂² ) ≤ 2 exp(−cmt²)   (6)
  for some constant c.
  2 Note: by the assumptions,
      E[ (1/m)‖Ax‖₂² ] = ‖x‖₂².   (7)

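The concentration in (6) is easy to observe empirically. Below is a hedged sketch for Gaussian entries (one particular sub-Gaussian choice); the dimensions, the threshold t, and the acceptance levels in the checks are illustrative, not constants from the slides.

```python
# Monte Carlo: (1/m)||Ax||^2 concentrates around ||x||^2 = 1.
import numpy as np

rng = np.random.default_rng(1)
m, N, trials = 200, 50, 2000

x = rng.standard_normal(N)
x /= np.linalg.norm(x)  # WLOG ||x||_2 = 1, as on the slides

# Entries of A are i.i.d. N(0, 1), so E[(1/m)||Ax||^2] = ||x||^2 = 1 (eq. 7).
vals = np.empty(trials)
for i in range(trials):
    A = rng.standard_normal((m, N))
    vals[i] = np.linalg.norm(A @ x) ** 2 / m

t = 0.25
mean_val = vals.mean()
empirical_tail = np.mean(np.abs(vals - 1.0) >= t)  # empirical LHS of (6)
```

With m = 200 the standard deviation of (1/m)‖Ax‖₂² is about √(2/m) ≈ 0.1, so deviations of 0.25 are already rare events.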

  15. Proof of Main Theorem
  1 Let S ⊂ {1, 2, …, N} with |S| = s, and let B_S = {x : supp(x) ⊂ S, ‖x‖₂ = 1}.
  2 Step 2: covering the unit sphere. Let ρ ∈ (0, 1/2). There exists a finite subset U of B_S satisfying
      |U| ≤ (1 + 2/ρ)^s   (8)
  and
      min_{u ∈ U} ‖x − u‖₂ ≤ ρ for all x ∈ B_S.   (9)

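Step 2 is the standard volumetric covering bound. A hypothetical numerical companion (not the slides' proof): greedily keep random points on the sphere in R^s that are farther than ρ from every point kept so far. The kept set is ρ-separated, so the volume argument guarantees its size never exceeds (1 + 2/ρ)^s, and by construction every sampled candidate lies within ρ of some kept point.

```python
import numpy as np

def greedy_separated_set(s, rho, n_candidates=5000, seed=0):
    """Greedy rho-separated subset of random unit vectors in R^s.
    Every rejected candidate is within rho of a kept point, so the result
    is also a rho-net of the sampled candidates."""
    rng = np.random.default_rng(seed)
    pts = rng.standard_normal((n_candidates, s))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)
    net = []
    for p in pts:
        if all(np.linalg.norm(p - u) > rho for u in net):
            net.append(p)
    return np.array(net)

s, rho = 3, 0.5
net = greedy_separated_set(s, rho)
size_bound = (1.0 + 2.0 / rho) ** s  # = 125 for these parameters
```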

  17. Proof of Main Theorem
  [Figure: illustration of the covering for different values of ρ]

  18. Proof of Main Theorem
  1 Combining Steps 1 and 2:
      P( | (1/m)‖Au‖₂² − ‖u‖₂² | ≥ t‖u‖₂² for some u ∈ U )
        ≤ Σ_{u ∈ U} P( | (1/m)‖Au‖₂² − ‖u‖₂² | ≥ t‖u‖₂² )
        ≤ 2 |U| exp(−cmt²)
        ≤ 2 (1 + 2/ρ)^s exp(−cmt²).

  19. Proof of Main Theorem
  1 Goal: restrict the eigenvalues of B = A_S^H A_S − I.
  2 Step 3: bounding the eigenvalues.
      ‖B‖₂→₂ ≤ t / (1 − 2ρ),   (10)
  with probability at least
      1 − 2 (1 + 2/ρ)^s exp(−cmt²).
  3 Proof: for x ∈ B_S and the nearest u ∈ U,
      |⟨Bx, x⟩| = |⟨Bu, u⟩ + ⟨B(x + u), x − u⟩|
                ≤ |⟨Bu, u⟩| + |⟨B(x + u), x − u⟩|
                < t + ‖B‖₂→₂ ‖x + u‖₂ ‖x − u‖₂
                ≤ t + 2ρ ‖B‖₂→₂ ,
  and taking the supremum over x ∈ B_S gives ‖B‖₂→₂ ≤ t + 2ρ‖B‖₂→₂, i.e. (10).

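The quantity bounded in Step 3 is exactly the restricted isometry constant on the fixed support S: since ⟨Bx, x⟩ = ‖A_S x‖₂² − ‖x‖₂² for B = A_S^H A_S − I, and the supremum of |⟨Bx, x⟩| over unit vectors equals ‖B‖₂→₂ for a Hermitian B, the operator norm of B is the smallest δ making (2) hold for all x supported on S. A small numerical check of this identity (illustrative parameters and support):

```python
import numpy as np

rng = np.random.default_rng(2)
m, N, s = 100, 30, 5
A = rng.standard_normal((m, N)) / np.sqrt(m)  # columns have unit expected norm
S = [0, 3, 7, 11, 19]                         # an arbitrary fixed support
A_S = A[:, S]
B = A_S.T @ A_S - np.eye(s)

# B is symmetric, so its 2->2 operator norm is the largest |eigenvalue|.
op_norm = np.max(np.abs(np.linalg.eigvalsh(B)))

# The deviation | ||A_S x||^2 - ||x||^2 | over unit vectors never exceeds ||B||.
deviations = []
for _ in range(200):
    x = rng.standard_normal(s)
    x /= np.linalg.norm(x)
    deviations.append(abs(np.linalg.norm(A_S @ x) ** 2 - 1.0))
max_dev = max(deviations)
```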

  22. Proof of Main Theorem
  1 Set t = (1 − 2ρ)δ < 1, so that ‖B‖₂→₂ < δ:
      P( ‖A_S^H A_S − I‖₂→₂ ≥ δ ) ≤ 2 (1 + 2/ρ)^s exp(−c(1 − 2ρ)²δ²m).   (11)
  2 Step 4: extend to an arbitrary support set S by a union bound:
      P(δ_s ≥ δ) ≤ Σ_{S : |S| = s} P( ‖A_S^H A_S − I‖₂→₂ ≥ δ )
                 ≤ 2 (N choose s) (1 + 2/ρ)^s exp(−c(1 − 2ρ)²δ²m)
                 ≤ 2 (eN/s)^s (1 + 2/ρ)^s exp(−c(1 − 2ρ)²δ²m).

