

SLIDE 1

Bounds on Sparse Recovery with Additional Structures

Abbas Kazemipour

University of Maryland, College Park · kaazemi@umd.edu

March 23, 2015

Abbas Kazemipour (UMD) Sparse Recovery with Side Info. March 23, 2015 1 / 19

SLIDE 2

Overview

1. Restricted Isometry Property
   - Introduction: RIP revisited
   - Proof of the RIP
2. RIP with Side Information
   - Motivation
   - Formulation: Sequences of Signals with Sparse Increments

SLIDE 3

Motivation

1. How many samples are necessary?
2. Today we discuss sufficiency.
3. Information-theoretic arguments are needed for the converse.
4. What if the sparsity has additional structure?
5. Example: sequences of signals with sparse increments
   - Total variation, differential ℓ1-minimization: application to ENF


SLIDE 8

Restricted Isometry Property (RIP)

1. Measurement model:
\[ y = Ax \tag{1} \]

RIP-s
\(A\) is said to satisfy the RIP of order \(s\) with constant \(\delta_s\) if
\[ (1 - \delta_s)\|x\|_2^2 \le \|Ax\|_2^2 \le (1 + \delta_s)\|x\|_2^2, \tag{2} \]
for every \(s\)-sparse \(x\).

2. Without loss of generality we may assume \(\|x\|_2^2 = 1\).
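As a quick illustration of the definition (not from the slides), the sketch below estimates a lower bound on \(\delta_s\) for a normalized Gaussian matrix by scanning random supports; the sizes N, m, s are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, m, s = 200, 80, 5

# Gaussian matrix scaled by 1/sqrt(m), matching the normalization under
# which the RIP will be stated for random matrices.
A = rng.standard_normal((m, N)) / np.sqrt(m)

# Lower-bound delta_s by scanning random size-s supports: restricted to a
# support S, the best RIP constant is the largest |eigenvalue - 1| of
# A_S^T A_S.
delta = 0.0
for _ in range(2000):
    S = rng.choice(N, size=s, replace=False)
    eigs = np.linalg.eigvalsh(A[:, S].T @ A[:, S])
    delta = max(delta, np.abs(eigs - 1).max())

print(f"estimated lower bound on delta_{s}: {delta:.3f}")
```

Scanning all \(\binom{N}{s}\) supports would give the exact \(\delta_s\); the random scan only lower-bounds it.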


SLIDE 10

Sub-Gaussian Random Matrix

1. The entries of \(A\) are independent mean-zero subgaussian random variables, i.e., for all \(t > 0\),
\[ \mathbb{P}(|A_{j,k}| \ge t) \le \beta e^{-\kappa t^2}. \tag{3} \]

2. Examples: Bernoulli and Gaussian random variables.
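For instance, a standard Gaussian satisfies (3) with \(\beta = 2\) and \(\kappa = 1/2\); a quick Monte Carlo check of that tail bound:

```python
import numpy as np

rng = np.random.default_rng(1)
samples = rng.standard_normal(1_000_000)

# A standard Gaussian satisfies the tail bound (3) with beta = 2 and
# kappa = 1/2: P(|X| >= t) <= 2 exp(-t^2 / 2).
for t in [0.5, 1.0, 2.0, 3.0]:
    empirical = np.mean(np.abs(samples) >= t)
    bound = 2 * np.exp(-t**2 / 2)
    print(f"t = {t}: P(|X| >= t) ~ {empirical:.4f} <= {bound:.4f}")
    assert empirical <= bound
```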


SLIDE 12

Main Theorem

1. Let the entries of \(A \in \mathbb{R}^{m \times N}\) have normalized variance. Then:

Sufficient Number of Measurements for Sparse Recovery
There exists \(C > 0\) such that \(\frac{1}{\sqrt{m}}A\) satisfies RIP-s with \(\delta_s \le \delta\) with probability at least \(1 - \epsilon\) provided
\[ m \ge \frac{C}{\delta^2}\left( s \log\frac{eN}{s} + \log\frac{2}{\epsilon} \right). \tag{4} \]

2. Setting \(\epsilon = 2\exp\!\left(-\frac{\delta^2 m}{2C}\right)\) gives
\[ m \ge \frac{2C}{\delta^2}\, s \log\frac{eN}{s}. \tag{5} \]
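To see the scaling in (4), the small sketch below evaluates the right-hand side for illustrative values of N, s, δ, ε; the theorem only asserts that some constant C exists, so C = 1 here is a placeholder, not a value from the slides.

```python
import math

def sufficient_m(N, s, delta, eps, C=1.0):
    # Right-hand side of (4). The theorem only asserts existence of the
    # constant C; C = 1 here is an illustrative placeholder.
    return (C / delta**2) * (s * math.log(math.e * N / s) + math.log(2 / eps))

N, s = 10_000, 20
m = sufficient_m(N, s, delta=0.3, eps=0.01)
print(f"sufficient m ~ {m:.0f}, far below the ambient dimension N = {N}")
```

The point of the bound: m grows like \(s \log(N/s)\), far below N.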

SLIDE 13

Proof of Main Theorem

1. Step 1:

Concentration Inequality for Subgaussian Random Matrices
For all \(x\) and \(t \in (0, 1)\),
\[ \mathbb{P}\left( \left| \tfrac{1}{m}\|Ax\|_2^2 - \|x\|_2^2 \right| \ge t\|x\|_2^2 \right) \le 2\exp(-2cmt^2), \tag{6} \]
for some constant \(c\).

2. Note: by assumption,
\[ \mathbb{E}\left[ \tfrac{1}{m}\|Ax\|_2^2 \right] = \|x\|_2^2. \tag{7} \]
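A Monte Carlo sanity check of (6) and (7) for a Gaussian matrix (illustrative sizes, not from the slides): for a fixed unit-norm x, the deviation of \(\frac{1}{m}\|Ax\|_2^2\) from 1 by more than t should be rare.

```python
import numpy as np

rng = np.random.default_rng(2)
N, m, trials, t = 100, 400, 2000, 0.2

x = rng.standard_normal(N)
x /= np.linalg.norm(x)  # so ||x||_2^2 = 1

# Empirical probability that (1/m)||Ax||^2 deviates from ||x||^2 = 1 by
# more than t, over freshly drawn Gaussian matrices A.
deviations = 0
for _ in range(trials):
    A = rng.standard_normal((m, N))
    deviations += abs(np.linalg.norm(A @ x) ** 2 / m - 1.0) >= t
print(f"P(deviation >= {t}) ~ {deviations / trials:.4f}")
```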


SLIDE 15

Proof of Main Theorem

1. Let \(S \subset \{1, 2, \dots, N\}\) with \(|S| = s\) and
\[ B_S = \{x : \mathrm{supp}(x) \subset S,\ \|x\|_2 = 1\}. \]

2. Step 2:

Covering the Unit Sphere
Let \(\rho \in (0, 1/2)\). There exists a finite subset \(U\) of \(B_S\) satisfying
\[ |U| \le \left(1 + \tfrac{2}{\rho}\right)^{s}, \tag{8} \]
and
\[ \min_{u \in U} \|x - u\|_2 \le \rho \quad \text{for all } x \in B_S. \tag{9} \]
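The covering bound (8) can be probed numerically. A greedy construction over sampled sphere points (a sketch in small dimension, not the proof's construction) produces a ρ-separated set, whose size is bounded by \((1 + 2/\rho)^s\) via the standard packing argument.

```python
import numpy as np

rng = np.random.default_rng(3)
s, rho = 3, 0.4

# Greedy rho-net on the unit sphere of R^s: keep a sampled point only if it
# is farther than rho from every point already kept. The kept points are
# rho-separated, so a packing argument bounds their number by (1 + 2/rho)^s.
points = rng.standard_normal((5_000, s))
points /= np.linalg.norm(points, axis=1, keepdims=True)

net = []
for p in points:
    if all(np.linalg.norm(p - u) > rho for u in net):
        net.append(p)

bound = (1 + 2 / rho) ** s
print(f"|U| = {len(net)} <= (1 + 2/rho)^s = {bound:.0f}")
```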


SLIDE 17

Proof of Main Theorem

Figure: Illustration of covering for different ρ’s


SLIDE 18

Proof of Main Theorem

1. Combining Steps 1 and 2, a union bound over the net \(U\) gives
\[
\mathbb{P}\left( \left| \|Au\|_2^2 - \|u\|_2^2 \right| \ge t\|u\|_2^2 \ \text{for some } u \in U \right)
\le \sum_{u \in U} \mathbb{P}\left( \left| \|Au\|_2^2 - \|u\|_2^2 \right| \ge t\|u\|_2^2 \right)
\le 2|U|\exp(-cmt^2)
\le 2\left(1 + \tfrac{2}{\rho}\right)^{s} \exp(-cmt^2).
\]

SLIDE 19

Proof of Main Theorem

1. Goal: restrict the eigenvalues of \(B = A_S^H A_S - I\).

2. Step 3:

Bounding the Eigenvalues
\[ \|B\|_{2\to 2} \le \frac{t}{1 - 2\rho}, \tag{10} \]
with probability at least \(1 - 2\left(1 + \tfrac{2}{\rho}\right)^{s} \exp(-cmt^2)\).

3. Proof: for \(x \in B_S\) and its nearest net point \(u \in U\) (so \(\|x - u\|_2 \le \rho\)),
\[
|\langle Bx, x\rangle| = |\langle Bu, u\rangle + \langle B(x + u), x - u\rangle|
\le |\langle Bu, u\rangle| + \|B\|_{2\to 2}\|x + u\|_2\|x - u\|_2
\le t + 2\rho\|B\|_{2\to 2}.
\]
Taking the supremum over \(x \in B_S\) gives \(\|B\|_{2\to 2} \le t + 2\rho\|B\|_{2\to 2}\), which rearranges to (10).
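As a numerical illustration of what Step 3 controls (sizes are arbitrary choices): for a Gaussian block \(A_S\) with many more rows than columns, \(\|A_S^T A_S - I\|_{2\to 2}\) is small.

```python
import numpy as np

rng = np.random.default_rng(4)
m, s = 500, 10

# A_S: an m x s Gaussian block scaled by 1/sqrt(m). Its Gram matrix
# concentrates around the identity, and ||A_S^T A_S - I|| is exactly the
# RIP constant restricted to the support S.
A_S = rng.standard_normal((m, s)) / np.sqrt(m)
dev = np.linalg.norm(A_S.T @ A_S - np.eye(s), 2)
print(f"||A_S^T A_S - I||_2->2 = {dev:.3f}")
```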


SLIDE 22

Proof of Main Theorem

1. Set \(t = (1 - 2\rho)\delta < 1\) so that \(\|B\|_{2\to 2} < \delta\):
\[ \mathbb{P}\left( \|A_S^H A_S - I\|_{2\to 2} \ge \delta \right) \le 2\left(1 + \tfrac{2}{\rho}\right)^{s} \exp(-c(1 - 2\rho)^2\delta^2 m). \tag{11} \]

2. Step 4: extend to an arbitrary support set \(S\) by a union bound:
\[
\mathbb{P}(\delta_s \ge \delta) \le \sum_{S : |S| = s} \mathbb{P}\left( \|A_S^H A_S - I\|_{2\to 2} \ge \delta \right)
\le 2\binom{N}{s}\left(1 + \tfrac{2}{\rho}\right)^{s} \exp(-c(1 - 2\rho)^2\delta^2 m)
\le 2\left(\tfrac{eN}{s}\right)^{s}\left(1 + \tfrac{2}{\rho}\right)^{s} \exp(-c(1 - 2\rho)^2\delta^2 m).
\]
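The last step uses the standard inequality \(\binom{N}{s} \le (eN/s)^s\); a quick numerical check:

```python
import math

# The union bound over supports uses binom(N, s) <= (eN/s)^s; check the
# inequality for a few (N, s) pairs.
for N, s in [(100, 5), (1_000, 20), (10_000, 50)]:
    lhs = math.comb(N, s)
    rhs = (math.e * N / s) ** s
    print(f"C({N},{s}) = {lhs:.3e} <= (eN/s)^s = {rhs:.3e}")
    assert lhs <= rhs
```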


SLIDE 24

Proof of Main Theorem

1. Set \(\rho = 2/(e^{7/2} - 1)\) (so that \(1 + 2/\rho = e^{7/2}\)) and
\[ \epsilon = 2\left(\tfrac{eN}{s}\right)^{s}\left(1 + \tfrac{2}{\rho}\right)^{s} \exp(-c(1 - 2\rho)^2\delta^2 m) \]
to solve for \(m\).

Main Theorem Revisited
\(\delta_s < \delta\) with probability at least \(1 - \epsilon\) if
\[ m > \frac{1}{c\delta^2}\left( \tfrac{4}{3}\, s\log\tfrac{eN}{s} + \tfrac{14}{3}\, s + \tfrac{4}{3}\log\tfrac{2}{\epsilon} \right). \tag{12} \]
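With the constants now explicit, the bound can be evaluated directly; the absolute constant c from the concentration inequality is still unspecified, so the value c = 0.5 below is an illustrative placeholder.

```python
import math

def m_bound(N, s, delta, eps, c=0.5):
    # Explicit constants from (12); the absolute constant c comes from the
    # concentration inequality and is unspecified, so c = 0.5 is a placeholder.
    return (1 / (c * delta**2)) * (
        (4 / 3) * s * math.log(math.e * N / s)
        + (14 / 3) * s
        + (4 / 3) * math.log(2 / eps)
    )

N, s = 10_000, 20
print(f"m > {m_bound(N, s, delta=0.3, eps=0.01):.0f} suffices for delta_s < 0.3")
```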

SLIDE 25

Motivation

1. Consider the linear measurement model \(y = Ax + n\) with \(x\) being \(s\)-sparse.
2. Suppose we have some side information about the locations of the nonzero entries.
3. Are the bounds given by the RIP optimal? Answer: not in general.
4. Examples of such additional structure:
   - Streaming of video frames
   - ENF
5. In general: sequences of signals with sparse increments
   - Slowly varying signals


SLIDE 32

Sequences of Signals with Sparse Increments

1. Our underlying model:
\[ y_k = A_k x_k + n_k, \tag{13} \]
with \(x_k \in \mathbb{R}^N\), \(A_k \in \mathbb{R}^{m \times N}\), and \(y_k, n_k \in \mathbb{R}^m\) for \(k = 1, 2, \dots, K\).

2. \(x_k\) is not necessarily sparse.
3. The increments \(x_k - x_{k-1}\) are \(s_k\)-sparse.
4. \(s^* = \sum_k s_k\).
5. For simplicity assume \(s_k = s\), thus \(s^* = Ks\).
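The model above is easy to instantiate; the sketch below builds such a sequence (starting from a dense \(x_1\), an assumption made here for illustration) and checks that every increment is \(s\)-sparse while the signals themselves are not.

```python
import numpy as np

rng = np.random.default_rng(5)
N, K, s = 50, 6, 3

# A sequence x_1, ..., x_K whose increments x_k - x_{k-1} are s-sparse;
# x_1 itself is dense, so the x_k are not sparse individually.
x = [rng.standard_normal(N)]
for _ in range(K - 1):
    inc = np.zeros(N)
    inc[rng.choice(N, size=s, replace=False)] = rng.standard_normal(s)
    x.append(x[-1] + inc)

for k in range(1, K):
    assert np.count_nonzero(x[k] - x[k - 1]) <= s
print(f"{K} signals; every increment has at most {s} nonzeros")
```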


SLIDE 37

Sequences of Signals with Sparse Increments: Equivalent Formulation

1. Initial model:
\[ y_k = A_k x_k + n_k. \tag{14} \]

2. Equivalent formulation:
\[ \tilde{y} = \tilde{A}\tilde{x} + \tilde{n}, \tag{15} \]
where
\[ \tilde{y} = [y_1', y_2', \dots, y_K']', \quad \tilde{A} = \mathrm{diag}(A_1, A_2, \dots, A_K), \quad \tilde{x} = [x_1', x_2', \dots, x_K']', \]
with \(\tilde{A}\) block diagonal since each \(y_k\) depends only on its own \(x_k\).
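The stacked formulation can be verified directly (small illustrative dimensions, chosen here for the demo):

```python
import numpy as np

rng = np.random.default_rng(6)
N, m, K = 8, 4, 3

A = [rng.standard_normal((m, N)) for _ in range(K)]
x = [rng.standard_normal(N) for _ in range(K)]
n = [rng.standard_normal(m) for _ in range(K)]
y = [A[k] @ x[k] + n[k] for k in range(K)]

# Stacked formulation: y~ = A~ x~ + n~ with A~ block diagonal, since each
# y_k depends only on its own x_k.
A_tilde = np.zeros((K * m, K * N))
for k in range(K):
    A_tilde[k*m:(k+1)*m, k*N:(k+1)*N] = A[k]
x_tilde = np.concatenate(x)
y_tilde = A_tilde @ x_tilde + np.concatenate(n)

assert np.allclose(y_tilde, np.concatenate(y))
print("stacked model matches the per-k model")
```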


SLIDE 39

Sequences of Signals with Sparse Increments: Equivalent Formulation

1. Better model:
\[ \tilde{y} = \tilde{A}\tilde{x} + \tilde{n}. \tag{16} \]

2. This does not capture the sparsity yet. Change variables to the increments \(\bar{x} = B\tilde{x}\), which is \(s^*\)-sparse, giving
\[ \tilde{y} = \tilde{A}B^{-1}\bar{x} + \tilde{n}, \tag{17} \]
where \(B\) is the block differencing matrix
\[ B = \begin{pmatrix} I & & & \\ -I & I & & \\ & \ddots & \ddots & \\ & & -I & I \end{pmatrix}. \tag{18} \]
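A minimal sketch of the differencing matrix \(B\) (block structure assumed as in (18), with small illustrative dimensions): it maps the stacked signal to its increments, and its inverse is a blockwise cumulative sum, so the change of variables loses nothing.

```python
import numpy as np

rng = np.random.default_rng(7)
N, K = 5, 4

# Block differencing matrix: I_N on the diagonal, -I_N on the subdiagonal.
I = np.eye(N)
B = np.kron(np.eye(K), I) - np.kron(np.eye(K, k=-1), I)

x = [rng.standard_normal(N) for _ in range(K)]
x_tilde = np.concatenate(x)
increments = B @ x_tilde

# B x~ stacks x_1, x_2 - x_1, ..., x_K - x_{K-1}.
assert np.allclose(increments[:N], x[0])
assert np.allclose(increments[N:2 * N], x[1] - x[0])

# B^{-1} is the block lower-triangular matrix of identities (a blockwise
# cumulative sum), so x~ is recoverable from the increments.
assert np.allclose(np.linalg.inv(B), np.kron(np.tril(np.ones((K, K))), I))
print("B maps the stacked signal to its sparse increments")
```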


SLIDE 41

References

Wainwright, Martin J. "Sharp thresholds for high-dimensional and noisy sparsity recovery using ℓ1-constrained quadratic programming (Lasso)." IEEE Transactions on Information Theory 55.5 (2009): 2183–2202.

SLIDE 42

Questions!
