
Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information - PowerPoint PPT Presentation



  1. Robust Uncertainty Principles: Exact Signal Reconstruction from Highly Incomplete Frequency Information. Emmanuel Candès, California Institute of Technology. Workshop on Wavelets and Multiscale Analysis, Oberwolfach, July 2004. Collaborators: Justin Romberg (Caltech), Terence Tao (UCLA).

  2. Incomplete Fourier Information. Observe Fourier samples f̂(ω) on a domain Ω of 22 radial lines: ≈ 8% coverage for a 256 × 256 image, ≈ 4% coverage for a 512 × 512 image.

  3. Classical Reconstruction. Backprojection: essentially reconstruct g* with ĝ*(ω) = f̂(ω) for ω ∈ Ω and ĝ*(ω) = 0 for ω ∉ Ω. [Figure: original phantom (Logan-Shepp) vs. naive reconstruction g*.]

  4. Interpolation? [Figure: a row of the Fourier matrix.] Undersamples Nyquist by a factor of 25 or 50 at high frequencies!

  5. Total Variation Reconstruction. Reconstruct g* by solving min_g ‖g‖_TV s.t. ĝ(ω) = f̂(ω), ω ∈ Ω. [Figure: original phantom (Logan-Shepp) vs. reconstruction with min BV + nonnegativity constraint.] g* = original: perfect reconstruction!

  6. Sparse Spike Train. A sparse sequence of N_T spikes; observe N_Ω Fourier coefficients. [Figure: spike train.]

  7. Interpolation?

  8. ℓ1 Reconstruction. Reconstruct by solving min_g Σ_t |g_t| s.t. ĝ(ω) = f̂(ω), ω ∈ Ω. For N_T ∼ N_Ω / 2, we recover f perfectly. [Figure: original vs. recovery from 30 Fourier samples.]
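As a concrete illustration, the ℓ1 program above can be posed as a linear program by splitting g into positive and negative parts and splitting the complex Fourier constraints into real and imaginary rows. A minimal sketch, assuming SciPy's `linprog`; the sizes (N = 64, 3 spikes, 32 observed frequencies) and the random seed are illustrative assumptions, not values from the talk:

```python
# Sketch: min-l1 spike-train recovery (P1) as a linear program.
import numpy as np
from scipy.optimize import linprog

N = 64
rng = np.random.default_rng(0)

# Sparse real spike train f with |T| = 3 spikes (assumed sizes).
f = np.zeros(N)
T = rng.choice(N, size=3, replace=False)
f[T] = rng.normal(size=3)

# Observe |Omega| = 32 Fourier coefficients at random frequencies.
Omega = rng.choice(N, size=32, replace=False)
F = np.exp(-2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N)
A = F[Omega]          # partial Fourier matrix, |Omega| x N
b = A @ f             # observed samples f_hat(omega), omega in Omega

# min sum|g_t| s.t. A g = b.  Split g = u - v with u, v >= 0, and split
# the complex equality constraints into real and imaginary parts.
A_eq = np.block([[A.real, -A.real], [A.imag, -A.imag]])
b_eq = np.concatenate([b.real, b.imag])
c = np.ones(2 * N)
res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
g = res.x[:N] - res.x[N:]

print(np.max(np.abs(g - f)))  # typically exact recovery in this regime
```

With |T| = 3 ≤ |Ω|/2 and a random Ω, this sketch recovers f up to solver tolerance in essentially every trial, matching the slide's claim.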

  9. Extension to TV. ‖g‖_TV = Σ_t |g_{t+1} − g_t| = ℓ1-norm of finite differences. Given frequency observations on Ω, solving min_g ‖g‖_TV s.t. ĝ(ω) = f̂(ω), ω ∈ Ω, we can perfectly reconstruct signals with a small number of jumps.
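The TV program is also a linear program: introduce slacks s ≥ |Dg| for the finite-difference operator D and minimize their sum. A sketch under assumed sizes (N = 64, three jumps, 30 observed frequencies, SciPy's `linprog`); ω = 0 is included in the sample set here because TV does not penalize a constant offset:

```python
# Sketch: min-TV reconstruction of a 1-D piecewise-constant signal as an LP.
import numpy as np
from scipy.optimize import linprog

N = 64
rng = np.random.default_rng(1)

# Piecewise-constant f with a small number of jumps (illustrative).
f = np.zeros(N)
f[10:30] = 1.0
f[30:45] = -0.5

# Observe 30 Fourier coefficients, including omega = 0 to pin the mean.
Omega = np.concatenate([[0], rng.choice(np.arange(1, N), size=29, replace=False)])
F = np.exp(-2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N)
A = F[Omega]
b = A @ f

# Variables x = [g; s] with s >= |D g|; minimizing sum(s) gives ||g||_TV.
D = np.eye(N, k=1)[:N - 1] - np.eye(N)[:N - 1]    # (N-1) x N differences
A_ub = np.block([[D, -np.eye(N - 1)], [-D, -np.eye(N - 1)]])
b_ub = np.zeros(2 * (N - 1))
A_eq = np.block([[A.real, np.zeros((len(Omega), N - 1))],
                 [A.imag, np.zeros((len(Omega), N - 1))]])
b_eq = np.concatenate([b.real, b.imag])
c = np.concatenate([np.zeros(N), np.ones(N - 1)])
bounds = [(None, None)] * N + [(0, None)] * (N - 1)
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
g = res.x[:N]

print(np.max(np.abs(g - f)))  # small: the few jumps are recovered
```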

  10. Reconstructed perfectly from 30 Fourier samples.

  11. Agenda • Model problem • Exact reconstruction from vastly undersampled data • Robust uncertainty principles • Stability • Sparsity and incoherence • Finding optimally sparse representations • Numerical experiments

  12. Model Problem • Signal made out of |T| spikes • Observed at only |Ω| frequency locations • Extensions: piecewise-constant signals; spikes in higher dimensions (2D, 3D, etc.); piecewise-constant images; many others

  13. Sharp Uncertainty Principles. The signal is sparse in time, with only |T| spikes. Solve the combinatorial optimization problem
(P0)  min_g ‖g‖_ℓ0 := #{t : g(t) ≠ 0}  s.t.  ĝ|_Ω = f̂|_Ω.
Theorem 1. Suppose N (the sample size) is prime. (i) If |T| ≤ |Ω| / 2, then (P0) reconstructs exactly. (ii) If |T| > |Ω| / 2, then (P0) fails at exactly reconstructing f: ∃ f1, f2 with ‖f1‖_ℓ0 + ‖f2‖_ℓ0 = |Ω| + 1 and f̂1(ω) = f̂2(ω) ∀ω ∈ Ω.
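Theorem 1(i) can be checked by brute force on a tiny instance. A sketch with assumed sizes: N = 13 is prime, |T| = 2 spikes, and |Ω| = 4 ≥ 2|T| observed frequencies, so the true support is the only one of size ≤ 2 that fits the data exactly:

```python
# Brute-force (P0): enumerate every support of size <= 2 and test for
# an exact least-squares fit A_S x = b.
import itertools
import numpy as np

N = 13
f = np.zeros(N)
f[[3, 9]] = [2.0, -1.0]                  # |T| = 2 spikes (assumed)
Omega = [0, 1, 2, 3]                     # |Omega| = 4 >= 2|T|
F = np.exp(-2j * np.pi * np.outer(Omega, np.arange(N)) / N)
b = F @ f

exact_fits = []
for k in range(1, 3):
    for S in itertools.combinations(range(N), k):
        cols = F[:, list(S)]
        x, *_ = np.linalg.lstsq(cols, b, rcond=None)
        if np.linalg.norm(cols @ x - b) < 1e-8:
            exact_fits.append((S, x))

print(len(exact_fits))   # exactly one: the true support (3, 9)
```

Because N is prime, every square minor of the Fourier matrix is nonsingular (Chebotarev), which is what forces uniqueness here.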

  14. ℓ1 Relaxation? Solve the convex optimization problem (an LP for real-valued signals)
(P1)  min_g ‖g‖_ℓ1 := Σ_t |g(t)|  s.t.  ĝ|_Ω = f̂|_Ω.
• Example: Dirac's comb, √N equispaced spikes (N a perfect square): invariant through the Fourier transform, f̂ = f; one can find Ω with |Ω| = N − √N and f̂(ω) = 0 ∀ω ∈ Ω; can't reconstruct. • More dramatic examples exist • But all these examples are very special

  15. Dirac's Comb. [Figure: f(t) and f̂(ω), √N equispaced spikes in both space and frequency.]
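The comb's Fourier invariance is easy to verify numerically, assuming the unitary DFT convention (divide `np.fft.fft` by √N):

```python
# The Dirac comb is a fixed point of the unitary DFT.
import numpy as np

N = 16                                   # perfect square (assumed size)
f = np.zeros(N)
f[::int(np.sqrt(N))] = 1.0               # sqrt(N) = 4 equispaced spikes

f_hat = np.fft.fft(f) / np.sqrt(N)       # unitary DFT

# f_hat = f, so f_hat vanishes on the N - sqrt(N) = 12 remaining
# frequencies: observing only those gives no information about f.
print(np.allclose(f_hat, f))             # True
```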

  16. Main Result.
Theorem 2. Suppose |T| ≤ α(M) · |Ω| / log N. Then min-ℓ1 reconstructs exactly with probability greater than 1 − O(N^{−M}). (N.b. one can choose α(M) ∼ [29.6(M + 1)]^{−1}.)
Extensions: • |T| = number of jump discontinuities (TV reconstruction) • |T| = number of 2D or 3D spikes • |T| = number of 2D jump discontinuities (2D TV reconstruction)

  17. Heuristics: Robust Uncertainty Principles. f is the unique minimizer of (P1) iff
Σ_t |f(t) + h(t)| > Σ_t |f(t)|  ∀h with ĥ|_Ω = 0.
Triangle inequality:
Σ_t |f(t) + h(t)| = Σ_T |f(t) + h(t)| + Σ_{T^c} |h(t)| ≥ Σ_T (|f(t)| − |h(t)|) + Σ_{T^c} |h(t)|.
Sufficient condition:
Σ_T |h(t)| ≤ Σ_{T^c} |h(t)|  ⇔  Σ_T |h(t)| ≤ ½ ‖h‖_ℓ1.
Conclusion: f is the unique minimizer if, for all h s.t. ĥ|_Ω = 0, it is impossible to 'concentrate' h on T.

  18. Connections • Donoho & Stark (88) • Donoho & Huo (01) • Santosa & Symes (86) • Gribonval & Nielsen (03) • Dobson & Santosa (96) • Tropp (03) and (04) • Bresler & Feng (96) • Donoho & Elad (03) • Vetterli et al. (03) • Gilbert et al. (04)

  19. Dual Viewpoint • The convex problem has a dual • Dual polynomial
P(t) = Σ_{ω∈Ω} P̂(ω) e^{iωt}, with
– P(t) = sgn(f)(t) = f(t) / |f(t)|, ∀t ∈ T
– |P(t)| < 1, ∀t ∈ T^c
– P̂ supported on the set Ω of visible frequencies
Theorem 3 (Strong Duality). (i) If F_{T→Ω} has full rank and there exists a dual polynomial, then the (P1) minimizer is unique and is equal to f. (ii) Conversely, if f is the unique minimizer of (P1), then there exists a dual polynomial.

  20. Dual Polynomial. [Figure: P(t) in space; P̂(ω) in frequency.]

  21. Construction of the Dual Polynomial. P(t) = Σ_{ω∈Ω} P̂(ω) e^{iωt} • P interpolates sgn(f) on T • P has minimum energy

  22. Auxiliary matrices:
Hf(t) := −Σ_{ω∈Ω} Σ_{t'∈T: t'≠t} e^{iω(t−t')} f(t')
• R_T is the restriction map, R_T f := f|_T
• R*_T is the obvious embedding obtained by extending by zero outside of T
• Identity: R_T R*_T = I_T
Set
P := (R*_T − (1/|Ω|) H)(I_T − (1/|Ω|) R_T H)^{−1} R_T sgn(f).
• Frequency support. P has Fourier transform supported in Ω, since
(R*_T − (1/|Ω|) H) g(t) = (1/|Ω|) Σ_{ω∈Ω} Σ_{t'∈T} e^{iω(t−t')} g(t') = (1/|Ω|) Σ_{ω∈Ω} ĝ(ω) e^{iωt}
(equivalently, R*_T − (1/|Ω|) H is proportional to F^{−1} P_Ω F R*_T, where P_Ω restricts to the frequencies in Ω).
• Spatial interpolation. P obeys
R_T P = (R_T R*_T − (1/|Ω|) R_T H)(I_T − (1/|Ω|) R_T H)^{−1} R_T sgn(f) = R_T sgn(f),
and so P agrees with sgn(f) on T.
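This construction can be carried out numerically. After simplification, the closed form above reduces to the least-squares interpolator P = F_Ω* Φ (Φ*Φ)^{−1} sgn(f)|_T with Φ = F_{T→Ω}; the sizes and the random T, Ω below are illustrative assumptions. The two deterministic properties, interpolation on T and Fourier support in Ω, hold exactly:

```python
# Sketch: compute the minimum-energy dual polynomial and check its
# interpolation and frequency-support properties.
import numpy as np

N = 64
rng = np.random.default_rng(2)
T = rng.choice(N, size=3, replace=False)
Omega = rng.choice(N, size=32, replace=False)

f = np.zeros(N)
f[T] = rng.normal(size=3)
sgn = np.sign(f[T])

F = np.exp(-2j * np.pi * np.outer(np.arange(N), np.arange(N)) / N)
F_Omega = F[Omega]                       # |Omega| x N
Phi = F_Omega[:, T]                      # F_{T -> Omega}

# Minimum-energy interpolator whose Fourier transform lives on Omega.
P = F_Omega.conj().T @ Phi @ np.linalg.solve(Phi.conj().T @ Phi, sgn)

mask = np.ones(N, dtype=bool); mask[Omega] = False
off_T = np.ones(N, dtype=bool); off_T[T] = False
print(np.allclose(P[T], sgn))                          # interpolates sgn(f) on T
print(np.allclose(np.fft.fft(P)[mask], 0, atol=1e-6))  # P_hat supported on Omega
print(np.max(np.abs(P[off_T])))          # < 1 when the construction succeeds
```

The hard part on the next slide, |P(t)| < 1 off T, is only a high-probability event, which is why it is printed rather than guaranteed here.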

  23. Hard Things. Recall
P := (R*_T − (1/|Ω|) H)(I_T − (1/|Ω|) R_T H)^{−1} R_T sgn(f).
• (I_T − (1/|Ω|) R_T H) must be invertible
• Need |P(t)| < 1, t ∉ T
Interpretation:
I_T − (1/|Ω|) R_T H = (1/|Ω|) [F_{T→Ω}]* F_{T→Ω},
i.e. invertibility means that F_{T→Ω} = R_Ω F R*_T is of full rank.

  24. Invertibility.
I_T − (1/|Ω|) R_T H = I_T − (1/|Ω|) H_0, where H_0(t, t') = 0 for t = t' and H_0(t, t') = −Σ_{ω∈Ω} e^{iω(t−t')} for t ≠ t'.
Fact: |H_0(t, t')| ∼ √|Ω|, so
‖H_0‖² ≤ Tr(H_0* H_0) = Σ_{t,t'} |H_0(t, t')|² ∼ |T|² · |Ω|.
We want ‖H_0‖ < |Ω|, and therefore
|T|² · |Ω| = O(|Ω|²)  ⇔  |T| = O(√|Ω|).
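The Frobenius bound used here, ‖H_0‖ ≤ √Tr(H_0* H_0), is easy to illustrate; the sizes below (N = 256, |T| = 8, |Ω| = 64) are assumptions chosen so the margin is visible:

```python
# Build H_0 for random T, Omega and compare spectral vs. Frobenius norm.
import numpy as np

N, sizeT, sizeO = 256, 8, 64
rng = np.random.default_rng(3)
T = rng.choice(N, size=sizeT, replace=False)
Omega = rng.choice(N, size=sizeO, replace=False)

omega = 2 * np.pi * Omega / N
dt = T[:, None] - T[None, :]                       # t - t'
H0 = -np.exp(1j * np.outer(dt.ravel(), omega)).sum(axis=1).reshape(sizeT, sizeT)
np.fill_diagonal(H0, 0)                            # H_0(t, t) = 0

spec = np.linalg.norm(H0, 2)                       # largest singular value
frob = np.sqrt(np.trace(H0.conj().T @ H0).real)
print(spec, frob, sizeO)  # spec <= frob, and typically spec << |Omega|
```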

  25. Key Estimates • Want to show the largest eigenvalue of H_0 (self-adjoint) is less than |Ω| • Take large powers of random matrices: Tr(H_0^{2n}) = λ_1^{2n} + … + λ_{|T|}^{2n} • Key estimate: develop bounds on E[Tr(H_0^{2n})] ≍ γ^{2n} n^n |T|^{n+1} |Ω|^n • Key intermediate result: ‖H_0‖ ≤ γ √(log |T|) · √(|T| |Ω|) with large probability • A lot of combinatorics!

  26. Some Details • Expectation:
E(Tr(H_0^{2n})) = E[ Σ_{t_1,…,t_{2n}: t_j ≠ t_{j+1}} Σ_{ω_1,…,ω_{2n} ∈ Ω} e^{i Σ_{j=1}^{2n} ω_j (t_j − t_{j+1})} ].
• Truncated Neumann series:
(I − H_0)^{−1} = (I − H_0^{2n})^{−1} (I + H_0 + … + H_0^{2n−1}).
Need to show that a random sum is less than 1.

  27. Robust Uncertainty Principles, I • T = supp(f) • Ω = supp(f̂)
Discrete uncertainty principle (Donoho & Stark 88): |T| + |Ω| ≥ 2√N.
Robust uncertainty principle (C. & Romberg 04): for nearly all T, Ω,
|T| + |Ω| ≥ β · N / √(log N).
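The Donoho-Stark bound can be exercised numerically; the Dirac comb attains it with equality. The random trials below are an illustrative assumption (N = 16, numerical support counted with a small threshold):

```python
# Check |T| + |Omega| >= 2*sqrt(N) for the comb and for random signals.
import numpy as np

N = 16
rng = np.random.default_rng(4)

def support_size(x, tol=1e-8):
    return int(np.sum(np.abs(x) > tol))

# Dirac comb: |T| = |Omega| = sqrt(N), so the sum equals 2*sqrt(N).
comb = np.zeros(N); comb[::4] = 1.0
print(support_size(comb) + support_size(np.fft.fft(comb)))  # 8 = 2*sqrt(16)

# Random sparse signals never beat the bound.
for _ in range(100):
    f = np.zeros(N)
    k = rng.integers(1, N + 1)
    f[rng.choice(N, size=k, replace=False)] = rng.normal(size=k)
    assert support_size(f) + support_size(np.fft.fft(f)) >= 2 * np.sqrt(N)
```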

  28. Robust Uncertainty Principles, II. Stronger result (C. & Romberg 04): for nearly all T and Ω obeying
|T| + |Ω| = c · N / √(log N):
• Suppose that T = supp(f); then ‖f̂|_Ω‖ ≤ ‖f̂‖ / 2.
• Suppose that Ω = supp(f̂); then ‖f|_T‖ ≤ ‖f‖ / 2.

  29. Robust Uncertainty Principles, III. Example: T = supp(f), i.e. R*_T R_T f = f. Then
‖f̂|_Ω‖² = ‖R_Ω F f‖² = ⟨R_Ω F f, R_Ω F f⟩.
Set Φ = R_Ω F R*_T = F_{T→Ω}. Since R*_T R_T f = f,
‖f̂|_Ω‖² = ⟨f|_T, Φ*Φ f|_T⟩.
But Φ*Φ = (1/N)(|Ω| · I − H_0), with ‖H_0‖ ≤ √(log |T|) · √(|T| · |Ω|). Hence
‖Φ*Φ‖ ≤ (1/N)(|Ω| + √(log |T| · |T| · |Ω|)) ≤ 1/2
if |T| + |Ω| = c · N / √(log N).

  30. Equivalence • Combinatorial optimization problem:
(P0)  min_g ‖g‖_ℓ0 := #{t : g(t) ≠ 0}  s.t.  ĝ|_Ω = f̂|_Ω
• Convex optimization problem (LP):
(P1)  min_g ‖g‖_ℓ1 := Σ_t |g(t)|  s.t.  ĝ|_Ω = f̂|_Ω
• Equivalence: for |T| ∼ |Ω| / log N, the solutions to (P0) and (P1) are unique and are the same!
