Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information (PowerPoint Presentation)



SLIDE 1

Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information

Emmanuel Candès, California Institute of Technology
Workshop on Wavelets and Multiscale Analysis, Oberwolfach, July 2004
Collaborators: Justin Romberg (Caltech), Terence Tao (UCLA)

SLIDE 2

Incomplete Fourier Information

Observe Fourier samples f̂(ω) on a domain Ω consisting of 22 radial lines in the frequency plane.

  • ≈ 8% coverage for 256 by 256 image
  • ≈ 4% coverage for 512 by 512 image
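
The coverage figures above follow from a quick back-of-the-envelope count (a rough model, my assumption for illustration: each of the 22 radial lines passes through about n of the n² frequency locations, ignoring overlap near the origin):

```python
# Approximate Fourier-domain coverage of 22 radial lines on an n-by-n grid.
# Rough model (an assumption, for illustration only): each line hits about
# n of the n*n frequency locations; overlaps between lines are ignored.
def radial_line_coverage(n: int, num_lines: int = 22) -> float:
    return num_lines * n / (n * n)

print(f"256 x 256: {radial_line_coverage(256):.1%}")  # about 8%
print(f"512 x 512: {radial_line_coverage(512):.1%}")  # about 4%
```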
SLIDE 3

Classical Reconstruction

Backprojection: essentially, reconstruct g* by setting

    ĝ*(ω) = f̂(ω) for ω ∈ Ω,   ĝ*(ω) = 0 for ω ∉ Ω

[Figure: Original phantom (Logan-Shepp) vs. naive backprojection reconstruction g*]

SLIDE 4

Interpolation?

[Figure: A row of the Fourier matrix]

At high frequencies, the radial-line samples undersample the Nyquist rate by a factor of 25 or 50!

SLIDE 5

Total Variation Reconstruction

Reconstruct g* by solving

    min_g ‖g‖_TV  s.t.  ĝ(ω) = f̂(ω), ω ∈ Ω

[Figure: Original phantom (Logan-Shepp) vs. min-TV reconstruction with nonnegativity constraint]

g* = original: perfect reconstruction!

SLIDE 6

Sparse Spike Train

• Sparse sequence of N_T spikes
• Observe N_Ω Fourier coefficients

[Figure: a sparse spike train]

SLIDE 7

Interpolation?

SLIDE 8

ℓ1 Reconstruction

Reconstruct by solving

    min_g Σ_t |g_t|  s.t.  ĝ(ω) = f̂(ω), ω ∈ Ω

For N_T ∼ N_Ω/2, we recover f perfectly.

[Figure: original spike train, recovered exactly from 30 Fourier samples]
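
For real-valued signals the program above is a linear program. A minimal sketch (not the authors' code; the problem sizes and the use of scipy's `linprog` are my choices) splits g into positive and negative parts and stacks the real and imaginary parts of the Fourier constraints:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, n_spikes, n_freqs = 32, 3, 16

# Sparse real spike train f.
f = np.zeros(N)
f[rng.choice(N, n_spikes, replace=False)] = rng.standard_normal(n_spikes)

# Observe f-hat on a random frequency set Omega.
omega = rng.choice(N, n_freqs, replace=False)
F = np.exp(-2j * np.pi * np.outer(omega, np.arange(N)) / N)  # selected DFT rows
b = F @ f

# min sum|g|  s.t.  g-hat = f-hat on Omega, as an LP: g = u - v, u, v >= 0.
A = np.vstack([F.real, F.imag])          # real-valued equality system
A_eq = np.hstack([A, -A])                # acts on the stacked (u, v)
b_eq = np.concatenate([b.real, b.imag])
res = linprog(np.ones(2 * N), A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
g = res.x[:N] - res.x[N:]

print("max recovery error:", np.max(np.abs(g - f)))
```

With |T| = 3 spikes and |Ω| = 16 of 32 frequencies this sits well inside the regime where recovery is exact in the experiments above.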

SLIDE 9

Extension to TV

‖g‖_TV = Σ_t |g_{t+1} − g_t| = ℓ1-norm of finite differences

Given frequency observations on Ω, solving

    min ‖g‖_TV  s.t.  ĝ(ω) = f̂(ω), ω ∈ Ω,

we can perfectly reconstruct signals with a small number of jumps.
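
The identity ‖g‖_TV = Σ_t |g_{t+1} − g_t| really is just the ℓ1-norm of the finite-difference sequence; for a piecewise-constant signal it counts the total jump size:

```python
import numpy as np

# A piecewise-constant signal with two jumps (of sizes 2 and 3).
g = np.array([0.0, 0.0, 2.0, 2.0, 2.0, -1.0, -1.0])
tv = np.abs(np.diff(g)).sum()   # l1-norm of the finite differences
print(tv)                       # 5.0
```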

SLIDE 10

Reconstructed perfectly from 30 Fourier samples

SLIDE 11

Agenda

  • Model problem
  • Exact reconstruction from vastly undersampled data
  • Robust uncertainty principles
  • Stability
  • Sparsity and Incoherence
  • Finding optimally sparse representations
  • Numerical Experiments
SLIDE 12

Model Problem

  • Signal made out of |T | spikes
  • Observed at only |Ω| frequency locations
  • Extensions

    – Piecewise constant signals
    – Spikes in higher dimensions: 2D, 3D, etc.
    – Piecewise constant images
    – Many others

SLIDE 13

Sharp Uncertainty Principles

  • Signal is sparse in time, only |T | spikes
  • Solve combinatorial optimization problem

    (P0)  min_g ‖g‖_ℓ0 := #{t : g(t) ≠ 0}  s.t.  ĝ|_Ω = f̂|_Ω

Theorem 1. Suppose N (the sample size) is prime.
(i) If |T| ≤ |Ω|/2, then (P0) reconstructs exactly.
(ii) If |T| > |Ω|/2, then (P0) fails at exactly reconstructing f: ∃ f1, f2 with ‖f1‖_ℓ0 + ‖f2‖_ℓ0 = |Ω| + 1 and f̂1(ω) = f̂2(ω), ∀ω ∈ Ω.

SLIDE 14

ℓ1 Relaxation?

Solve the convex optimization problem (an LP for real-valued signals):

    (P1)  min_g ‖g‖_ℓ1 := Σ_t |g(t)|  s.t.  ĝ|_Ω = f̂|_Ω

• Example: Dirac's comb, √N equispaced spikes (N a perfect square)
  – Invariant under the Fourier transform: f̂ = f
  – Can find |Ω| = N − √N frequencies with f̂(ω) = 0, ∀ω ∈ Ω
  – On those frequencies f is indistinguishable from the zero signal, so we can't reconstruct
• More dramatic examples exist
• But all these examples are very special
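
The comb's Fourier invariance is easy to check numerically with the unitary DFT normalization (N = 16 is my choice here):

```python
import numpy as np

N = 16                      # perfect square
s = int(np.sqrt(N))         # sqrt(N) = 4 equispaced spikes
comb = np.zeros(N)
comb[::s] = 1.0             # spikes at 0, 4, 8, 12

# Under the unitary DFT, the comb is its own Fourier transform.
comb_hat = np.fft.fft(comb, norm="ortho")
print(np.allclose(comb_hat, comb))                  # True

# Its transform vanishes on N - sqrt(N) = 12 frequencies, so observing
# f-hat there cannot distinguish the comb from the zero signal.
print(np.count_nonzero(np.isclose(comb_hat, 0)))    # 12
```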
SLIDE 15

Dirac’s Comb

[Figure: the Dirac comb f and its Fourier transform f̂, each consisting of √N spikes spaced √N apart]

SLIDE 16

Main Result

Theorem 2. Suppose

    |T| ≤ α(M) · |Ω| / log N.

Then min-ℓ1 reconstructs exactly with probability greater than 1 − O(N^{−M}). (N.b. one can choose α(M) ∼ [29.6(M + 1)]^{−1}.)

Extensions: the same holds with |T| the number of
• jump discontinuities (TV reconstruction)
• 2D, 3D spikes
• 2D jump discontinuities (2D TV reconstruction)
SLIDE 17

Heuristics: Robust Uncertainty Principles

f is the unique minimizer of (P1) iff

    Σ_t |f(t) + h(t)| > Σ_t |f(t)|   for all h ≠ 0 with ĥ|_Ω = 0.

Triangle inequality:

    Σ_t |f(t)+h(t)| = Σ_{t∈T} |f(t)+h(t)| + Σ_{t∈T^c} |h(t)|
                    ≥ Σ_{t∈T} (|f(t)| − |h(t)|) + Σ_{t∈T^c} |h(t)|

Sufficient condition:

    Σ_{t∈T} |h(t)| < Σ_{t∈T^c} |h(t)|  ⇔  Σ_{t∈T} |h(t)| < ½ ‖h‖_ℓ1

Conclusion: f is the unique minimizer if, for all h with ĥ|_Ω = 0, it is impossible to 'concentrate' h on T.

SLIDE 18

Connections

  • Donoho & Stark (88)
  • Donoho & Huo (01)
  • Gribonval & Nielsen (03)
  • Tropp (03) and (04)
  • Donoho & Elad (03)
  • Gilbert et al. (04)
  • Santosa & Symes (86)
  • Dobson & Santosa (96)
  • Bresler & Feng (96)
  • Vetterli et. al. (03)
SLIDE 19

Dual Viewpoint

  • The convex problem has a dual
  • Dual polynomial:

    P(t) = Σ_{ω∈Ω} P̂(ω) e^{iωt}

    – P(t) = sgn(f)(t) = f(t)/|f(t)|, ∀t ∈ T
    – |P(t)| < 1, ∀t ∈ T^c
    – P̂ supported on the set Ω of visible frequencies

Theorem 3 (Strong Duality).
(i) If F_{T→Ω} has full rank and there exists a dual polynomial, then the (P1) minimizer is unique and is equal to f.
(ii) Conversely, if f is the unique minimizer of (P1), then there exists a dual polynomial.

SLIDE 20

Dual Polynomial

[Figure: the dual polynomial P(t) in space and its Fourier transform P̂(ω) in frequency]

SLIDE 21

Construction of the Dual Polynomial

    P(t) = Σ_{ω∈Ω} P̂(ω) e^{iωt}

  • P interpolates sgn(f) on T
  • P has minimum energy
SLIDE 22

Auxiliary matrices:

    (1/|Ω|) H = I − F⁻¹ P_Ω F R*_T,   Hf(t) := − Σ_{ω∈Ω} Σ_{t'∈T: t'≠t} e^{iω(t−t')} f(t')

  • R_T is the restriction map, R_T f := f|_T
  • R*_T is the obvious embedding obtained by extending by zero outside of T
  • Identity: R_T R*_T = I_T

The candidate dual polynomial:

    P := (R*_T − (1/|Ω|) H)(I_T − (1/|Ω|) R_T H)⁻¹ R_T sgn(f).

  • Frequency support: P has Fourier transform supported in Ω, since

    (R*_T − (1/|Ω|) H) g(t) = (1/|Ω|) Σ_{ω∈Ω} Σ_{t'∈T} e^{iω(t−t')} g(t') = (1/|Ω|) Σ_{ω∈Ω} ĝ(ω) e^{iωt}

  • Spatial interpolation: P obeys

    R_T P = (R_T R*_T − (1/|Ω|) R_T H)(R_T R*_T − (1/|Ω|) R_T H)⁻¹ R_T sgn(f) = R_T sgn(f),

    and so P agrees with sgn(f) on T.

SLIDE 23

Hard Things

    P := (R*_T − (1/|Ω|) H)(I_T − (1/|Ω|) R_T H)⁻¹ R_T sgn(f).

The hard parts:
  • (I_T − (1/|Ω|) R_T H) invertible
  • |P(t)| < 1, t ∉ T

Interpretation:

    I_T − (1/|Ω|) R_T H = [F_{T→Ω}]* F_{T→Ω}

i.e. invertibility means that F_{T→Ω} = R_Ω F R*_T is of full rank.

SLIDE 24

Invertibility

    I_T − (1/|Ω|) R_T H = I_T − (1/|Ω|) H0,
    H0(t,t') = 0 for t = t',   H0(t,t') = − Σ_{ω∈Ω} e^{iω(t−t')} for t ≠ t'.

Fact: |H0(t,t')| ∼ √|Ω|, so

    ‖H0‖² ≤ Tr(H0* H0) = Σ_{t,t'} |H0(t,t')|² ∼ |T|² · |Ω|

Want ‖H0‖ < |Ω|, and therefore

    |T|² · |Ω| = O(|Ω|²)  ⇔  |T| = O(√|Ω|)
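
The matrix H0, and the identity Φ*Φ = (1/N)(|Ω|·I − H0) used later (slide "Robust Uncertainty Principles, III"), can be checked directly; the dimensions below are my choices, and F is the unitary DFT:

```python
import numpy as np

rng = np.random.default_rng(2)
N, T_size, Omega_size = 64, 4, 20
T = rng.choice(N, T_size, replace=False)
Omega = rng.choice(N, Omega_size, replace=False)

# Unitary DFT restricted to rows Omega and columns T: Phi = R_Omega F R*_T.
F = np.exp(-2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N) / np.sqrt(N)
Phi = F[np.ix_(Omega, T)]

# H0: zero on the diagonal, -sum_{omega in Omega} exp(i omega (t - t')) off it.
d = np.subtract.outer(T, T)                                  # t - t'
H0 = -np.exp(2j * np.pi * np.multiply.outer(Omega, d) / N).sum(axis=0)
np.fill_diagonal(H0, 0)

lhs = Phi.conj().T @ Phi
rhs = (Omega_size * np.eye(T_size) - H0) / N
print(np.allclose(lhs, rhs))        # the identity Phi*Phi = (|Omega| I - H0)/N
print("||H0|| =", np.linalg.norm(H0, 2), "  |Omega| =", Omega_size)
```

The last line lets you compare the spectral norm of H0 against |Ω|, the quantity that controls invertibility above.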
SLIDE 25

Key Estimates

  • Want to show the largest eigenvalue of H0 (self-adjoint) is less than |Ω|.
  • Take large powers of random matrices:

    Tr(H0^{2n}) = λ1^{2n} + … + λ_{|T|}^{2n}

  • Key estimate: develop bounds of the form

    E[Tr(H0^{2n})] ≍ γ^{2n} n^n |T|^{n+1} |Ω|^n

  • Key intermediate result:

    ‖H0‖ ≤ γ √(log |T|) · √(|T| |Ω|)   with large probability

  • A lot of combinatorics!
SLIDE 26

Some Details

  • Expectation (with t_{2n+1} := t_1):

    E[Tr(H0^{2n})] = Σ_{t1,…,t2n: tj≠tj+1} E[ Σ_{ω1,…,ω2n∈Ω} e^{i Σ_{j=1}^{2n} ωj(tj − tj+1)} ]

  • Truncated Neumann series:

    (I − H0)⁻¹ = (I − H0^{2n})⁻¹ (I + H0 + … + H0^{2n−1})

    Need to show that a random sum is less than 1.
SLIDE 27

Robust Uncertainty Principles, I

  • T = supp(f)
  • Ω = supp(f̂)

Discrete uncertainty principle (Donoho & Stark, 1988):

    |T| + |Ω| ≥ 2√N

Robust uncertainty principle (C. & Romberg, 2004): for nearly all T, Ω,

    |T| + |Ω| ≥ β · N/√(log N)
SLIDE 28

Robust Uncertainty Principles, II

Stronger result (C. & Romberg, 2004): for nearly all T and Ω obeying

    |T| + |Ω| = c · N/√(log N):

  • Suppose that T = supp(f); then ‖f̂|_Ω‖ ≤ ‖f̂‖/2.
  • Suppose that Ω = supp(f̂); then ‖f|_T‖ ≤ ‖f‖/2.

SLIDE 29

Robust Uncertainty Principles, III

Example: T = supp(f), i.e. R*_T f = f.

    ‖f̂|_Ω‖² = ‖R_Ω F f‖² = ⟨R_Ω F f, R_Ω F f⟩.

Set Φ = R_Ω F R*_T = F_{T→Ω}. Since R*_T f = f,

    ‖f̂|_Ω‖² = ⟨f|_T, Φ*Φ f|_T⟩.

But

    Φ*Φ = (1/N)(|Ω| · I − H0),   ‖H0‖ ≤ √(log |T|) · √(|T| · |Ω|).

Hence

    ‖Φ*Φ‖ ≤ (1/N)(|Ω| + √(log |T|) · √(|T| · |Ω|)) ≤ 1/2

if |T| + |Ω| = c · N/√(log N).

SLIDE 30

Equivalence

  • Combinatorial optimization problem:

    (P0)  min_g ‖g‖_ℓ0 := #{t : g(t) ≠ 0}  s.t.  ĝ|_Ω = f̂|_Ω

  • Convex optimization problem (LP):

    (P1)  min_g ‖g‖_ℓ1 := Σ_t |g(t)|  s.t.  ĝ|_Ω = f̂|_Ω

  • Equivalence: for |T| ∼ |Ω|/log N, the solutions to (P0) and (P1) are unique and are the same!

SLIDE 31

Numerical Results

  • Signal length N = 1024
  • Randomly place |T | spikes, observe |Ω| random frequencies
  • Measure % recovered perfectly
  • white = always recovered, black = never recovered

[Figure: empirical recovery rate as a function of |T|/|Ω|, for a range of |Ω|]

SLIDE 32

Extensions (In Progress)

Two bases Φ and Ψ:

  • f admits a decomposition in Φ: f = Σ_t α_t φ_t
  • |T| terms in the expansion
  • Only able to observe the measurements ⟨f, ψ_ω⟩ on a small (random) set Ω
  • |Ω| measurements
  • Incoherence: M = √N · sup_{t,ω} |⟨φ_t, ψ_ω⟩|
    – M = 1: maximally incoherent; e.g. spikes and sinusoids
    – M = √N: maximally coherent; e.g. Φ = Ψ
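
For spikes and sinusoids the incoherence is exactly M = 1, since every entry of the unitary DFT matrix has modulus 1/√N; a quick check (N = 64 is my choice):

```python
import numpy as np

N = 64
# Phi: the spike (identity) basis; Psi: the unitary Fourier basis.
Phi = np.eye(N)
Psi = np.exp(2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N) / np.sqrt(N)

# All inner products <phi_t, psi_omega> at once; each has modulus 1/sqrt(N).
G = Phi.conj().T @ Psi
M = np.sqrt(N) * np.abs(G).max()
print(f"{M:.6f}")   # 1.000000 (maximally incoherent)

# The opposite extreme, Phi = Psi, gives M = sqrt(N) (maximally coherent).
M_coh = np.sqrt(N) * np.abs(Psi.conj().T @ Psi).max()
print(f"{M_coh:.6f}")   # 8.000000
```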

SLIDE 33

Exact Reconstruction (In Progress)

Solve the convex problem (LP):

    min_g Σ_t |⟨g, φ_t⟩|  s.t.  ⟨g, ψ_ω⟩ = ⟨f, ψ_ω⟩, ω ∈ Ω

Then perfect reconstruction occurs with overwhelming probability provided

    |T| ≍ |Ω| / (M² [log N]^a)

  • Can prove a = 3
  • Probably true for a = 1
  • Sharp
  • Twist: both T AND Ω need to be randomized.
SLIDE 34

Key Structure

  • Fourier and spikes: need to understand the eigenvalues of the random matrix

    H0(t,t') = (1/N) Σ_{ω∈Ω} e^{iω(t−t')} 1_{t≠t'},   t, t' ∈ T.

  • General case: need to understand the eigenvalues of the random matrix

    H0(t,t') = (1/N) Σ_{ω∈Ω} ⟨φ_t, ψ_ω⟩ ⟨ψ_ω, φ_{t'}⟩ 1_{t≠t'},   t, t' ∈ T.

  • Even more combinatorics!
SLIDE 35

Example I: Spikes and Random Bases

  • Signal made out of |T | spikes
  • Observe |Ω| coefficients in a random basis
  • Incoherence: M ≈ 2√(log N)
  • Perfect reconstruction occurs if

    |T| ≍ |Ω| / (log N)²

SLIDE 36

Numerical Results

  • Signal length N = 256
  • Randomly place |T | spikes, observe |Ω| frequencies in random basis.
  • Measure % recovered perfectly
  • White = always recovered, black = never recovered
  • Incoherence: M = 4.3.

[Figure: empirical probabilities of exact reconstruction (spikes and random basis) as a function of |T|/|Ω| and |Ω|]

SLIDE 37

Example II, Wavelets and Sinusoids

  • Wavelets: φ_t(u) = 2^{j/2} ψ(2^j u − k), t = (j, k);   Φ_j = {t : |t| = j}
  • Complex exponentials: ψ_k(u) = e^{i2πku/N}/√N;   Ψ_j = {k : 2^j ≤ |k| < 2^{j+1}}
  • Mutual blockwise incoherence:

    M_j = √N · sup_{t∈Φ_j, k∈Ψ_j} |⟨φ_t, ψ_k⟩|.

  • Strategy: blockwise ℓ1-optimization

    min Σ_{t∈Φ_j} |⟨g, φ_t⟩|  s.t.  ⟨g, ψ_k⟩ = ⟨f, ψ_k⟩, k ∈ Ω ∩ Ψ_j

    Behaves well if |T_j| = O(|Ω_j|/[M_j² log N]), ∀j. See Donoho and Huo, 2001.

SLIDE 38

Reconstruction of Piecewise Polynomials, I

  • Randomly select 8 jump discontinuities
  • Randomly select cubic polynomial in between jumps
  • Observe 60 Fourier coefficients at random

[Figure: exact reconstruction of a random piecewise cubic polynomial from 60 Fourier samples; the original and reconstructed signals coincide]

SLIDE 39

Reconstruction of Piecewise Polynomials, II

[Figures: two random piecewise cubic polynomials, each exactly reconstructed from only 60 Fourier coefficients]

SLIDE 40

Finding Optimally Sparse Representations

Signal f is a superposition of spikes and sinusoids:

    f(s) = Σ_{t∈T} α_t δ(s − t) + Σ_{ω∈Ω} α_ω e^{iωs}

  • T: locations of the spikes
  • Ω: locations of the sinusoids

There are many ways to decompose f as a superposition of spikes and sinusoids. Search for the sparsest decomposition α = (α_t, α_ω).

SLIDE 41

  • Dictionary Φ: columns are spikes and sinusoids

    Φα = f  ⇔  [ I  F* ] (α_t, α_ω)ᵀ = f

  • Sparsest decomposition: solution to

    (P0)  min ‖α‖_ℓ0  s.t.  Φα = f

    a combinatorial problem: intractable!
  • Basis pursuit (Chen, Donoho, Saunders):

    (P1)  min ‖α‖_ℓ1  s.t.  Φα = f

  • Are the solutions to (P0) and (P1) the same?
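
The dictionary equation Φα = f, and the non-uniqueness that makes the sparsest decomposition worth searching for, can be made concrete (the size N = 8 and the coefficients below are illustrative choices, not from the talk):

```python
import numpy as np

N = 8
F = np.exp(-2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N) / np.sqrt(N)
Dict = np.hstack([np.eye(N), F.conj().T])   # Phi = [I  F*]: spikes, then sinusoids

# A 2-sparse alpha: one spike (t = 2) plus one sinusoid (omega = 3).
alpha = np.zeros(2 * N, dtype=complex)
alpha[2], alpha[N + 3] = 1.5, -0.7
f = Dict @ alpha

# The same f also has a dense decomposition using spikes alone, so the
# representation is not unique; (P0)/(P1) ask for the sparsest one.
alpha_dense = np.concatenate([f, np.zeros(N)])
print(np.allclose(Dict @ alpha_dense, f))                   # True
print(np.count_nonzero(alpha), "vs", np.count_nonzero(alpha_dense))
```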
SLIDE 42

Previous Work

  • Donoho and Huo (2001): (P0) and (P1) have the same solution if

    |T| + |Ω| ≤ 0.5 · √N

  • Bruckstein and Elad (2002): (P0) and (P1) have the same solution if

    |T| + |Ω| ≤ 0.914 · √N

SLIDE 43

Robust UP’s and Optimally Sparse Decompositions

Construct a generic sparse signal f = Φα:

  • Select T at random
  • Select Ω at random
  • Select random amplitudes (α_t, α_ω)

Theorem 4 (C. and Romberg, 2004). With overwhelming probability, the solutions to (P0) and (P1) are unique and identical provided

    |T| + |Ω| ∼ N/log N.

  • A consequence of robust uncertainty principles
  • Might hold with √(log N) instead.
SLIDE 44

Empirical Results

SLIDE 45

[Figure: probability of finding the optimal representation vs. # spikes + # sinusoids, for N = 128, 256, and 512]

SLIDE 46

Other Phantoms, I

[Figure: original phantom vs. classical reconstruction g*]

SLIDE 47

[Figure: original phantom vs. total variation reconstruction]

g* = TV reconstruction = exact!

SLIDE 48

Other Phantoms, II

[Figure: second phantom, original vs. classical reconstruction g*]

SLIDE 49

[Figure: second phantom, original vs. total variation reconstruction]

g* = TV reconstruction = exact!

SLIDE 50

Scanlines

[Figure: a scanline of the original phantom, with the classical (black) and TV (red) reconstructions along the same scanline]

SLIDE 51

Summary

  • Exact reconstruction
  • Tied to new uncertainty principles
  • Stability
  • Optimality
  • Many extensions, applications

Contact: emmanuel@acm.caltech.edu