  1. Simultaneous Sparsity

     Joel A. Tropp, Anna C. Gilbert, Martin J. Strauss
     {jtropp|annacg|martinjs}@umich.edu
     Department of Mathematics, The University of Michigan

  2. Simple Sparse Approximation

     - Work in the d-dimensional, complex inner-product space $\mathbb{C}^d$
     - Let $\{\varphi_\omega : \omega \in \Omega\}$ be a collection of unit-norm elementary signals
     - Choose T indices $\lambda_1, \ldots, \lambda_T \in \Omega$
     - Suppose we measure a noisy sparse signal
       $$ s = \sum_{t=1}^{T} c_t \, \varphi_{\lambda_t} + \nu $$
     - The simple sparse approximation problem asks:
       1. Can we identify the indices $\lambda_1, \ldots, \lambda_T$?
       2. Can we estimate the coefficients $c_1, \ldots, c_T$?

     (A small numerical sketch of this measurement model follows.)
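For concreteness, here is a minimal Python sketch of the measurement model above. The random complex Gaussian dictionary, the problem sizes, and the noise level are illustrative assumptions and are not specified on this slide.

```python
# Minimal sketch of s = sum_t c_t * phi_{lambda_t} + nu with an illustrative
# random unit-norm dictionary (the talk does not fix a dictionary here).
import numpy as np

rng = np.random.default_rng(0)
d, n_atoms, T = 8, 64, 3          # ambient dimension, dictionary size, sparsity

# Unit-norm elementary signals phi_omega stored as the columns of Phi
Phi = rng.standard_normal((d, n_atoms)) + 1j * rng.standard_normal((d, n_atoms))
Phi /= np.linalg.norm(Phi, axis=0)

# Pick T indices and coefficients, then form the noisy sparse signal
Lambda = rng.choice(n_atoms, size=T, replace=False)
c = rng.standard_normal(T) + 1j * rng.standard_normal(T)
nu = 0.05 * (rng.standard_normal(d) + 1j * rng.standard_normal(d))
s = Phi[:, Lambda] @ c + nu
```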

  3. Facts about Greedy Algorithms

     A greedy algorithm for sparse approximation makes locally optimal choices in an effort to obtain a good global solution.

     Advantages
     - Fast
     - Easy to implement
     - Works well for many problems

     Disadvantages
     - Less robust than $\ell_1$ methods
     - Not effective for superresolution

  4. Orthogonal Matching Pursuit (OMP)

     Input:  A signal s and the number of terms T
     Output: Indices $\{\lambda_1, \ldots, \lambda_T\}$ and coefficients $\{c_1, \ldots, c_T\}$

     1. Set the initial residual $r_0 = s$ and the counter $t = 1$
     2. Find an index $\lambda_t$ that solves $\max_{\omega \in \Omega} |\langle r_{t-1}, \varphi_\omega \rangle|$
     3. Determine the orthogonal projector $P_t$ onto $\operatorname{span}\{\varphi_{\lambda_1}, \ldots, \varphi_{\lambda_t}\}$
     4. Calculate the new residual: $r_t = s - P_t s$
     5. Increment t, and repeat until $t = T$
     6. The coefficient estimates appear in the expansion $P_T s = \sum_{t=1}^{T} c_t \varphi_{\lambda_t}$

     (A Python sketch of this procedure follows.)
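A sketch of the procedure above in Python, assuming the dictionary is stored as a matrix Phi whose columns are the unit-norm atoms; the orthogonal projection in steps 3-4 is computed with a least-squares solve rather than an explicit projector.

```python
import numpy as np

def omp(Phi, s, T):
    """Orthogonal Matching Pursuit: return T indices and their coefficients."""
    residual = s.copy()
    indices = []
    coeffs = np.zeros(0)
    for _ in range(T):
        # Step 2: atom most correlated with the current residual
        lam = int(np.argmax(np.abs(Phi.conj().T @ residual)))
        indices.append(lam)
        # Steps 3-4: project s onto the span of the chosen atoms (least squares)
        coeffs, *_ = np.linalg.lstsq(Phi[:, indices], s, rcond=None)
        residual = s - Phi[:, indices] @ coeffs
    # Step 6: the entries of coeffs are the coefficient estimates
    return indices, coeffs
```

Applied to the signal generated in the earlier sketch, omp(Phi, s, T) should recover the chosen indices when the noise is small and the atoms are not too correlated.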

  5. Simultaneous Sparse Approximation

     Idea: more observations should make the problem easier.

     - Choose T indices $\lambda_1, \ldots, \lambda_T \in \Omega$
     - Suppose we measure K noisy sparse signals
       $$ s_k = \sum_{t=1}^{T} c_{tk} \, \varphi_{\lambda_t} + \nu_k, \qquad k = 1, \ldots, K $$
     - The simultaneous sparse approximation problem asks:
       1. Can we identify the indices $\lambda_1, \ldots, \lambda_T$?
       2. Can we estimate the set of coefficients $\{c_{tk}\}$?

  6. Application: MIMO Communications

     Transmit 1: $\varphi_{\lambda_1}$        Receive 1: $\sum_t h_{t1} \varphi_{\lambda_t} + \nu_1$
     Transmit 2: $\varphi_{\lambda_2}$        Receive 2: $\sum_t h_{t2} \varphi_{\lambda_t} + \nu_2$
     ...                                      ...
     Transmit t: $\varphi_{\lambda_t}$        Receive k: $\sum_t h_{tk} \varphi_{\lambda_t} + \nu_k$

     - The dimension d corresponds with the length of a transmission block
     - Send one elementary signal on each of T transmit antennas
     - Measure one superposition on each of K receive antennas
     - The numbers $h_{tk}$ are fading coefficients
     - The vectors $\nu_k$ are additive noise

     Goal: Identify which elementary signals were transmitted

  7. Simultaneous OMP (S-OMP)

     Input:  A d x K signal matrix S and the number of terms T
     Output: Indices $\{\lambda_1, \ldots, \lambda_T\}$

     1. Set the initial residual $R_0 = S$ and the counter $t = 1$
     2. Find an index $\lambda_t$ that solves $\max_{\omega \in \Omega} \sum_{k=1}^{K} |\langle R_{t-1} e_k, \varphi_\omega \rangle|$
     3. Determine the orthogonal projector $P_t$ onto $\operatorname{span}\{\varphi_{\lambda_1}, \ldots, \varphi_{\lambda_t}\}$
     4. Calculate the new residual: $R_t = S - P_t S$
     5. Increment t, and repeat until $t = T$

     (A Python sketch follows.)
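A Python sketch of S-OMP under the same conventions as the OMP sketch above: the K signals are the columns of a d-by-K matrix S, and the selection rule sums the correlation magnitudes across channels.

```python
import numpy as np

def s_omp(Phi, S, T):
    """Simultaneous OMP: choose T atoms jointly for all columns of S."""
    residual = S.copy()
    indices = []
    for _ in range(T):
        # Step 2: maximize sum_k |<R_{t-1} e_k, phi_omega>| over omega
        scores = np.sum(np.abs(Phi.conj().T @ residual), axis=1)
        indices.append(int(np.argmax(scores)))
        # Steps 3-4: project every column of S onto the chosen atoms
        C, *_ = np.linalg.lstsq(Phi[:, indices], S, rcond=None)
        residual = S - Phi[:, indices] @ C
    return indices
```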

  8. Experiments with S-OMP

     - The $\mathbb{Z}_4$ Kerdock code yields 64 elementary signals in $\mathbb{C}^8$
     - Fix the number of transmit/receive antennas and the SNR
     - For each trial, we construct K signals
       $$ s_k = \sum_{t=1}^{T} h_{tk} \, \varphi_{\lambda_t} + \nu_k $$
       where the $h_{tk}$ are Gaussian variables and the $\nu_k$ are Gaussian vectors
     - S-OMP is used to pick T elementary signals
     - Calculate the fraction correctly identified
     - Average over 1000 trials

     (One trial of this experiment is sketched below.)
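One trial of this experiment, sketched with the s_omp function above. A random unit-norm dictionary stands in for the Z4 Kerdock code, and the choices of T, K, and the noise level are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
d, n_atoms, T, K = 8, 64, 4, 4     # 64 atoms in C^8 as on the slide; T, K illustrative
noise_std = 0.1                    # stand-in for a chosen SNR

Phi = rng.standard_normal((d, n_atoms)) + 1j * rng.standard_normal((d, n_atoms))
Phi /= np.linalg.norm(Phi, axis=0)

Lambda = rng.choice(n_atoms, size=T, replace=False)                   # transmitted atoms
H = rng.standard_normal((T, K)) + 1j * rng.standard_normal((T, K))    # fading coefficients
N = noise_std * (rng.standard_normal((d, K)) + 1j * rng.standard_normal((d, K)))
S = Phi[:, Lambda] @ H + N

found = s_omp(Phi, S, T)           # s_omp defined in the sketch above
fraction_correct = len(set(found) & set(Lambda.tolist())) / T
```

Repeating this trial 1000 times and averaging fraction_correct reproduces the quantity reported in the experiments.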

  9. [Figure] Hamming distance per receive antenna as a function of SNR at 4 transmit antennas. Curves for SNR = 6, 10, 11.2, 13, 16, and 20 dB; x-axis: number of receive antennas (1-8); y-axis: Hamming distance (roughly 0.05-0.55).

  10. [Figure] Hamming distance as a function of the number of receive antennas k at SNR = 20 dB. Curves for k = 1, ..., 8; x-axis: number of transmit antennas (2-7); y-axis: Hamming distance (roughly 0-0.7).

  11. A Theoretical Result for S-OMP

     Claim: Each result for simple sparse approximation has an analog for simultaneous sparse approximation.

     Define the coherence parameter
     $$ \mu = \max_{\lambda \neq \omega} |\langle \varphi_\lambda, \varphi_\omega \rangle| $$

     Theorem 1. Suppose that $T\mu \le 1/3$. Let S be a signal matrix. After T iterations, S-OMP calculates a T-term approximation $A_T$ of the signal matrix that satisfies
     $$ \| S - A_T \|_F \le \sqrt{1 + 6KT} \; \| S - A_{\mathrm{opt}} \|_F $$
     where $A_{\mathrm{opt}}$ is the optimal T-term approximation of S in Frobenius norm.

     (A short sketch for computing $\mu$ follows.)
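A small sketch for computing the coherence parameter of a dictionary and checking the hypothesis of Theorem 1; Phi (with unit-norm columns) and the sparsity level T are assumed inputs.

```python
import numpy as np

def coherence(Phi):
    """mu = max over lambda != omega of |<phi_lambda, phi_omega>|."""
    G = np.abs(Phi.conj().T @ Phi)   # magnitudes of the Gram matrix
    np.fill_diagonal(G, 0.0)         # exclude the diagonal (lambda == omega)
    return float(G.max())

# Example check of the hypothesis T * mu <= 1/3:
# mu = coherence(Phi)
# hypothesis_holds = T * mu <= 1 / 3
```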

  12. Convex Relaxation for SSA

     - Can view simultaneous sparse approximation as a combinatorial optimization problem
       $$ \min_C \ \#\{\text{nonzero rows of } C\} \quad \text{subject to} \quad \| S - \Phi C \|_F \le \varepsilon $$
     - Can replace this combinatorial problem with a related convex program
       $$ \min_C \ \sum_{\omega} \max_k |c_{\omega k}| \quad \text{subject to} \quad \| S - \Phi C \|_F \le \delta $$
     - One can prove the two problems often have similar solutions

     (A sketch of the convex program follows.)
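A sketch of the convex program above using cvxpy, a modeling tool chosen here for illustration (the slides do not name a solver). Real-valued data is assumed for simplicity, although the talk works in $\mathbb{C}^d$.

```python
import cvxpy as cp

def relaxed_ssa(Phi, S, delta):
    """min sum_omega max_k |C[omega, k]|  subject to  ||S - Phi C||_F <= delta."""
    n_atoms, K = Phi.shape[1], S.shape[1]
    C = cp.Variable((n_atoms, K))
    objective = cp.sum(cp.max(cp.abs(C), axis=1))      # sum of row-wise maxima
    constraints = [cp.norm(S - Phi @ C, "fro") <= delta]
    cp.Problem(cp.Minimize(objective), constraints).solve()
    return C.value
```

Rows of the returned matrix that are numerically zero correspond to atoms the relaxation discards; the surviving rows play the role of the nonzero rows in the combinatorial problem.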

  13. Related Work

     - Çetin, Malioutov, Willsky (Algorithms, Applications)
     - Chen, Huo (Theory)
     - Cotter, Engan, Kreutz-Delgado, Rao, et al. (Algorithms, Applications)
     - Gribonval, Nielsen (Theory, Applications)
     - Leviatan, Lutoborsky, Temlyakov (Theory)

  14. Publications and Preprints

     - TGS, "Simultaneous sparse approximation via greedy pursuit," ICASSP 2005.
     - TGS, "Algorithms for simultaneous sparse approximation. Part I: Greedy pursuit," submitted November 2004.
     - T, "Algorithms for simultaneous sparse approximation. Part II: Convex relaxation," submitted November 2004.
     - GT, "Applications of sparse approximation in communications," submitted January 2005.
     - TG, "Signal recovery from partial information via Orthogonal Matching Pursuit," submitted March 2005.

     Papers available from http://www.umich.edu/~jtropp/
     Contact information: {jtropp|annacg|martinjs}@umich.edu

  15. Greed: Still Good

     [Figure] Percentage of input signals recovered (d = 256) as a function of the number of measurements N (up to 250). One curve for each sparsity level m = 4, 12, 20, 28, 36; y-axis: percentage recovered (0-100).

  16. Greed: Still Good

     Theorem 2. Suppose that s is an arbitrary m-sparse signal in $\mathbb{R}^d$. Given $K p\, m \log d$ random linear measurements of s, OMP can recover s with probability $1 - O(d^{-p})$.

     This theorem is more or less equivalent to results for $\ell_1$ minimization due to Candès-Tao, Donoho, and Rudelson-Vershynin.

     (A small recovery sketch follows.)
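An illustrative sketch of the setting in Theorem 2, reusing the omp sketch above: Gaussian random measurements of an m-sparse signal. The sparsity level, the Gaussian measurement ensemble, and the constant multiplying m log d are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
d, m = 256, 8
N = 4 * m * int(np.log(d))                    # number of measurements; the factor 4 is arbitrary

x = np.zeros(d)
support = rng.choice(d, size=m, replace=False)
x[support] = rng.standard_normal(m)           # an arbitrary m-sparse signal in R^d

A = rng.standard_normal((N, d)) / np.sqrt(N)  # random linear measurements
y = A @ x

# Run OMP with the measurement matrix playing the role of the dictionary
indices, coeffs = omp(A, y, m)                # omp defined in the earlier sketch
recovered = sorted(indices) == sorted(support.tolist())
```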
