SLIDE 1

Simultaneous Sparsity

Joel A. Tropp Anna C. Gilbert Martin J. Strauss

{jtropp|annacg|martinjs}@umich.edu Department of Mathematics The University of Michigan


SLIDE 2

Simple Sparse Approximation

❦
❧ Work in the d-dimensional, complex inner-product space C^d
❧ Let {ϕ_ω : ω ∈ Ω} be a collection of unit-norm elementary signals
❧ Choose T indices λ_1, . . . , λ_T ∈ Ω
❧ Suppose we measure a noisy sparse signal

  s = Σ_{t=1}^{T} c_t ϕ_{λ_t} + ν

❧ The simple sparse approximation problem asks

  1. Can we identify the indices λ_1, . . . , λ_T?
  2. Can we estimate the coefficients c_1, . . . , c_T?
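As a concrete sketch of this measurement model, the snippet below draws a random unit-norm dictionary and forms s = Σ_t c_t ϕ_{λ_t} + ν. The dimensions, noise level, and the random dictionary are illustrative choices, not values from the slides.

```python
import numpy as np

rng = np.random.default_rng(0)

d, T = 16, 3          # ambient dimension and sparsity level (illustrative)
n_atoms = 32          # size of the index set Omega (illustrative)

# Unit-norm elementary signals phi_omega, stored as columns of a d x n_atoms matrix
Phi = rng.standard_normal((d, n_atoms))
Phi /= np.linalg.norm(Phi, axis=0)

# Choose T distinct indices and coefficients, then measure s = sum_t c_t phi_{lambda_t} + nu
lam = rng.choice(n_atoms, size=T, replace=False)
c = rng.standard_normal(T)
nu = 0.05 * rng.standard_normal(d)   # small additive noise
s = Phi[:, lam] @ c + nu
```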

Simultaneous Sparse Approximation (ICASSP 2005) 2

SLIDE 3

Facts about Greedy Algorithms

❦ A greedy algorithm for sparse approximation makes locally optimal choices in an effort to obtain a good global solution.

Advantages

❧ Fast
❧ Easy to implement
❧ Work well for many problems

Disadvantages

❧ Less robust than ℓ1 methods
❧ Not effective for superresolution


SLIDE 4

Orthogonal Matching Pursuit (OMP)

❦ Input: A signal s and the number of terms T
Output: Indices {λ_1, . . . , λ_T} and coefficients {c_1, . . . , c_T}

1. Set the initial residual r_0 = s and the counter t = 1
2. Find an index λ_t that solves

   max_{ω∈Ω} |⟨r_{t−1}, ϕ_ω⟩|

3. Determine the orthogonal projector P_t onto span{ϕ_{λ_1}, . . . , ϕ_{λ_t}}
4. Calculate the new residual: r_t = s − P_t s
5. Increment t, and repeat until t = T
6. The coefficient estimates appear in the expansion

   P_T s = Σ_{t=1}^{T} c_t ϕ_{λ_t}
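Steps 1–6 translate directly into code. Below is a minimal NumPy sketch; the function name is ours, and the orthogonal projection of steps 3–4 is implemented via least squares on the selected atoms, which computes the same P_t s.

```python
import numpy as np

def omp(s, Phi, T):
    """Orthogonal Matching Pursuit, following the steps on this slide.

    s   : signal vector of length d
    Phi : d x |Omega| dictionary whose columns are unit-norm atoms
    T   : number of terms to select
    Returns the selected indices and the coefficient estimates.
    """
    residual = np.array(s, dtype=complex)
    indices = []
    for _ in range(T):
        # Step 2: atom most correlated with the current residual
        correlations = np.abs(Phi.conj().T @ residual)
        correlations[indices] = -np.inf        # never re-select an index
        indices.append(int(np.argmax(correlations)))
        # Steps 3-4: least-squares fit = orthogonal projection onto chosen atoms
        sub = Phi[:, indices]
        coeffs, *_ = np.linalg.lstsq(sub, s, rcond=None)
        residual = s - sub @ coeffs
    return indices, coeffs
```

With an orthonormal dictionary (e.g. the identity), OMP recovers the support and coefficients exactly in T steps.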


SLIDE 5

Simultaneous Sparse Approximation

Idea: More observations should make the problem easier

❧ Choose T indices λ_1, . . . , λ_T ∈ Ω
❧ Suppose we measure K noisy sparse signals

  s_k = Σ_{t=1}^{T} c_{tk} ϕ_{λ_t} + ν_k,   k = 1, . . . , K

❧ The simultaneous sparse approximation problem asks

  1. Can we identify the indices λ_1, . . . , λ_T?
  2. Can we estimate the set of coefficients {c_{tk}}?


SLIDE 6

Application: MIMO Communications

❦
Transmit 1: ϕ_{λ_1}    Receive 1: Σ_t h_{t1} ϕ_{λ_t} + ν_1
Transmit 2: ϕ_{λ_2}    Receive 2: Σ_t h_{t2} ϕ_{λ_t} + ν_2
· · ·
Transmit t: ϕ_{λ_t}    Receive k: Σ_t h_{tk} ϕ_{λ_t} + ν_k

❧ The dimension d corresponds to the length of a transmission block
❧ Send one elementary signal on each of T transmit antennas
❧ Measure one superposition on each of K receive antennas
❧ The numbers h_{tk} are fading coefficients
❧ The vectors ν_k are additive noise

Goal: Identify which elementary signals were transmitted


SLIDE 7

Simultaneous OMP

❦ Input: A d × K signal matrix S and the number of terms T
Output: Indices {λ_1, . . . , λ_T}

1. Set the initial residual R_0 = S and the counter t = 1
2. Find an index λ_t that solves

   max_{ω∈Ω} Σ_{k=1}^{K} |⟨R_{t−1} e_k, ϕ_ω⟩|

3. Determine the orthogonal projector P_t onto span{ϕ_{λ_1}, . . . , ϕ_{λ_t}}
4. Calculate the new residual: R_t = S − P_t S
5. Increment t, and repeat until t = T
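The only change from plain OMP is the selection rule, which sums correlation magnitudes over the K columns of the residual matrix. A minimal NumPy sketch (the function name is ours; the projection is again done via least squares):

```python
import numpy as np

def s_omp(S, Phi, T):
    """Simultaneous OMP: greedily select atoms that explain all K columns of S.

    S   : d x K matrix of observed signals
    Phi : d x |Omega| dictionary with unit-norm columns
    T   : number of atoms to select
    """
    residual = np.array(S, dtype=complex)
    indices = []
    for _ in range(T):
        # Step 2: sum of correlation magnitudes across the K channels
        scores = np.abs(Phi.conj().T @ residual).sum(axis=1)
        scores[indices] = -np.inf            # never re-select an atom
        indices.append(int(np.argmax(scores)))
        # Steps 3-4: project every column of S onto the chosen atoms
        sub = Phi[:, indices]
        C, *_ = np.linalg.lstsq(sub, S, rcond=None)
        residual = S - sub @ C
    return indices
```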


SLIDE 8

Experiments with S-OMP

❦
❧ The Z4 Kerdock code yields 64 elementary signals in C^8
❧ Fix the number of transmit/receive antennas and the SNR
❧ For each trial, we construct K signals

  s_k = Σ_{t=1}^{T} h_{tk} ϕ_{λ_t} + ν_k

  where the h_{tk} are Gaussian variables and the ν_k are Gaussian vectors
❧ S-OMP is used to pick T elementary signals
❧ Calculate the fraction correctly identified
❧ Average over 1000 trials
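The trial loop can be sketched as follows. This is not the slides' experiment: the Kerdock dictionary is replaced by a random unit-norm complex dictionary, the SNR parameterization is a loose choice of ours, and the trial count is reduced, so the numbers should only indicate the protocol's shape.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in dictionary: the slides use the Z4 Kerdock code (64 atoms in C^8);
# random unit-norm complex atoms are substituted here purely for illustration.
d, n_atoms = 8, 64
Phi = rng.standard_normal((d, n_atoms)) + 1j * rng.standard_normal((d, n_atoms))
Phi /= np.linalg.norm(Phi, axis=0)

def trial(T, K, snr_db):
    """One trial: fraction of the T transmitted atoms recovered by S-OMP."""
    lam = rng.choice(n_atoms, size=T, replace=False)
    H = rng.standard_normal((T, K))                       # Gaussian fading
    sigma = 10.0 ** (-snr_db / 20.0)                      # loose SNR parameterization
    N = sigma * (rng.standard_normal((d, K)) + 1j * rng.standard_normal((d, K)))
    S = Phi[:, lam] @ H + N

    residual, picked = np.array(S), []
    for _ in range(T):                                    # S-OMP selection loop
        scores = np.abs(Phi.conj().T @ residual).sum(axis=1)
        scores[picked] = -np.inf
        picked.append(int(np.argmax(scores)))
        sub = Phi[:, picked]
        C, *_ = np.linalg.lstsq(sub, S, rcond=None)
        residual = S - sub @ C
    return len(set(picked) & set(lam)) / T

# Average the identification rate over trials (100 here instead of 1000)
avg = np.mean([trial(T=2, K=4, snr_db=20) for _ in range(100)])
```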


SLIDE 9

[Figure: Hamming distance per receive antenna as a function of the number of receive antennas (1–8) at 4 transmit antennas, for SNR = 20, 16, 13, 11.2, 10, and 6 dB.]

SLIDE 10

[Figure: Hamming distance as a function of the number of transmit antennas (2–7) for k = 1, . . . , 8 receive antennas at SNR = 20 dB.]

SLIDE 11

A Theoretical Result for S-OMP

Claim: Each result for simple sparse approximation has an analog for simultaneous sparse approximation.

Define the coherence parameter

  µ = max_{λ≠ω} |⟨ϕ_λ, ϕ_ω⟩|

Theorem 1. Suppose that Tµ ≤ 1/3. Let S be a signal matrix. After T iterations, S-OMP calculates a T-term approximation A_T of the signal matrix that satisfies

  ‖S − A_T‖_F ≤ √(1 + 6KT) ‖S − A_opt‖_F

where A_opt is the optimal T-term approximation of S in Frobenius norm.
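To make the hypothesis Tµ ≤ 1/3 concrete, µ can be read off the Gram matrix of the dictionary. The random dictionary below is an illustrative stand-in (coherence-based conditions are typically restrictive, so the admissible T is small):

```python
import numpy as np

rng = np.random.default_rng(2)

# Unit-norm dictionary; dimensions are illustrative choices
d, n_atoms = 256, 512
Phi = rng.standard_normal((d, n_atoms))
Phi /= np.linalg.norm(Phi, axis=0)

# Coherence mu = max_{lambda != omega} |<phi_lambda, phi_omega>|
G = np.abs(Phi.conj().T @ Phi)       # magnitudes of all pairwise inner products
np.fill_diagonal(G, 0.0)             # exclude the diagonal (lambda == omega)
mu = G.max()

# Largest sparsity level T for which Theorem 1's hypothesis T*mu <= 1/3 holds
max_T = int(np.floor(1.0 / (3.0 * mu)))
```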


SLIDE 12

Convex Relaxation for SSA

❦
❧ Can view simultaneous sparse approximation as a combinatorial optimization problem

  min_C  (# nonzero rows of C)  subject to  ‖S − ΦC‖_F ≤ ε

❧ Can replace this combinatorial problem with a related convex program

  min_C  Σ_ω max_k |c_{ωk}|  subject to  ‖S − ΦC‖_F ≤ δ

❧ One can prove the two problems often have similar solutions
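The two objectives are easy to state in code. No solver is shown here; the helper names are ours, and the snippet only evaluates the combinatorial row-count and its convex surrogate (a sum over rows of ℓ∞ norms) for a given coefficient matrix:

```python
import numpy as np

def num_nonzero_rows(C, tol=1e-12):
    """Combinatorial objective: number of rows of C that are not identically zero."""
    return int(np.sum(np.max(np.abs(C), axis=1) > tol))

def relaxed_objective(C):
    """Convex surrogate: sum over rows omega of max_k |c_{omega k}|."""
    return float(np.sum(np.max(np.abs(C), axis=1)))

# A coefficient matrix with exactly two nonzero rows
C = np.zeros((5, 3))
C[1] = [1.0, -2.0, 0.5]
C[3] = [0.0, 4.0, 1.0]
```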


SLIDE 13

Related Work

❦
❧ Çetin, Malioutov, Willsky (Algorithms, Applications)
❧ Chen, Huo (Theory)
❧ Cotter, Engan, Kreutz-Delgado, Rao, et al. (Algorithms, Applications)
❧ Gribonval, Nielsen (Theory, Applications)
❧ Leviatan, Lutoborsky, Temlyakov (Theory)


SLIDE 14

Publications and Preprints

❦
❧ TGS, "Simultaneous sparse approximation via greedy pursuit," ICASSP 2005.
❧ TGS, "Algorithms for simultaneous sparse approximation. Part I: Greedy pursuit," submitted November 2004.
❧ T, "Algorithms for simultaneous sparse approximation. Part II: Convex relaxation," submitted November 2004.
❧ GT, "Applications of sparse approximation in communications," submitted January 2005.
❧ TG, "Signal recovery from partial information via Orthogonal Matching Pursuit," submitted March 2005.

Papers available from http://www.umich.edu/~jtropp/. Contact information: {jtropp|annacg|martinjs}@umich.edu.


SLIDE 15

Greed: Still Good

[Figure: Percentage of input signals recovered (d = 256) as a function of the number of measurements N (50–250), for sparsity levels m = 4, 12, 20, 28, 36.]


SLIDE 16

Greed: Still Good

❦ Theorem 2. Suppose that s is an arbitrary m-sparse signal in R^d. Given K_p m log d random linear measurements of s, OMP can recover s with probability 1 − O(d^{−p}).

This theorem is more or less equivalent to results for ℓ1 minimization due to Candès–Tao, Donoho, and Rudelson–Vershynin.
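An empirical sketch of Theorem 2: recover an m-sparse s ∈ R^d from N noiseless Gaussian measurements y = As by running OMP against the measurement matrix. The specific N below is an illustrative choice of ours, not the theorem's K_p m log d constant.

```python
import numpy as np

rng = np.random.default_rng(3)

d, m, N = 256, 4, 100                 # dimensions chosen for illustration
support = rng.choice(d, size=m, replace=False)
s = np.zeros(d)
s[support] = rng.standard_normal(m)   # an m-sparse signal

A = rng.standard_normal((N, d)) / np.sqrt(N)   # random measurement matrix
y = A @ s                                      # N linear measurements

# OMP applied to the columns of A
residual, picked = y.copy(), []
for _ in range(m):
    scores = np.abs(A.T @ residual)
    scores[picked] = -np.inf
    picked.append(int(np.argmax(scores)))
    sub = A[:, picked]
    coeffs, *_ = np.linalg.lstsq(sub, y, rcond=None)
    residual = y - sub @ coeffs

s_hat = np.zeros(d)
s_hat[picked] = coeffs                # reconstructed signal
```

With N well above m log d, recovery succeeds with overwhelming probability, matching the slide's claim.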
