

SLIDE 1

Weighted Superimposed Codes and Constrained Compressed Sensing

Wei Dai (ECE UIUC)

Joint work with Olgica Milenkovic (ECE UIUC)

University of Illinois at Urbana-Champaign

DIMACS 2009

Dai and Milenkovic (UIUC) WSC & Constrained CS DIMACS 2009 1 / 15

SLIDE 2

Compressed Sensing

Classic setup: y = Φx + e, where Φ ∈ R^{m×N} with m ≪ N.

Kashin, 1977; Bresler et al., 1999; Donoho et al., 2004; Candès et al., 2005; · · ·

Only one constraint:

◮ x ∈ R^N is K-sparse.
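The classic setup on this slide can be sketched numerically; all parameters below are illustrative, not from the talk:

```python
import numpy as np

# Classic compressed-sensing model: a K-sparse x in R^N observed through
# m << N linear measurements y = Phi x + e (illustrative sizes).
rng = np.random.default_rng(0)
m, N, K = 40, 100, 5

x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)   # K active coordinates
x[support] = rng.integers(1, 4, size=K)          # nonzero values

Phi = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, N))  # random measurement matrix
e = 0.01 * rng.normal(size=m)                    # measurement noise
y = Phi @ x + e                                  # m measurements of the sparse signal
```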

SLIDE 3

Constrained Compressed Sensing

Constraints on x

◮ The xi's are correlated (Dai & Milenkovic; Baraniuk et al.; · · · ).
◮ The xi's are bounded integers.
◮ May improve performance.

Constraints on Φ

◮ Sparse/structured (Dai & Milenkovic; Indyk et al.; Do et al.; Strauss et al.).
◮ lp-norm constraints + nonnegativity.
◮ May introduce performance loss.

Performance requirement: noise tolerance.


SLIDE 4

Application 1: CS DNA Microarrays

DNA microarray: measures the concentrations of certain molecules (such as mRNA) for tens of thousands of genes simultaneously.

Major issue: each sequence needs a unique identifier ⇒ high cost.

CS DNA microarrays (Dai, Sheikh, Milenkovic and Baraniuk; Hassibi). Constraints:

◮ x: xi = the number of certain molecules; |xi| ≤ t (bounded integer).
◮ Φ: Φi,j = the affinity (the probability) between probe i and target j; ‖Φi‖1 = 1, Φi,j ≥ 0.

The same model works for low-light imaging, drug screening, · · ·
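The microarray-style constraints can be sketched as follows; the sizes and the way Φ is generated are illustrative assumptions, the slide only fixes the constraint set:

```python
import numpy as np

# Constrained measurement model from the slide: Phi nonnegative with rows
# summing to 1 (affinities as probabilities), x a bounded integer vector.
rng = np.random.default_rng(1)
m, N, t = 30, 80, 3

Phi = rng.random((m, N))
Phi /= Phi.sum(axis=1, keepdims=True)   # enforce ||Phi_i||_1 = 1, Phi_{i,j} >= 0

x = rng.integers(0, t + 1, size=N)      # bounded integer "concentrations"
y = Phi @ x                             # each measurement mixes all targets
```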


SLIDE 5

Application 2: Multiuser Communications

A multiple-access channel with K users:

y = Σ_{i=1}^{K} hi √Pi ti + e,  ti ∈ Ci,

where Ci is the i-th user's codebook and |Ci| = ni.
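A minimal simulation of this superposition model, with illustrative codebook sizes, gains, and powers (none are from the talk):

```python
import numpy as np

# K-user multiple-access superposition: y = sum_i h_i * sqrt(P_i) * t_i + e.
# Each user i transmits one codeword t_i drawn from its own codebook C_i.
rng = np.random.default_rng(2)
m, K = 16, 4

codebooks = [rng.normal(size=(8, m)) for _ in range(K)]  # |C_i| = 8, length-m codewords
h = rng.normal(size=K)                                   # per-user channel gains
P = np.full(K, 2.0)                                      # per-user transmit powers

t = [C[rng.integers(len(C))] for C in codebooks]         # each user picks a codeword
e = 0.1 * rng.normal(size=m)                             # additive channel noise
y = sum(h[i] * np.sqrt(P[i]) * t[i] for i in range(K)) + e
```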




SLIDE 8

Questions regarding Constrained CS (CCS)

How to analyze the gain/loss for a given set of constraints?

How do the constraints affect the reconstruction algorithms?

Our observation: coding-theoretic techniques help.


SLIDE 9

Superimposed Codes

Euclidean Superimposed Codes (Ericson and Györfi, 1988)

◮ xi ∈ {0, 1}.
◮ ‖vi‖2 = 1.
◮ Distance requirement ⇒ deterministic noise tolerance:

‖Φ(x1 − x2)‖2 ≥ d  ∀ x1 ≠ x2.

Applications ⇒ weighted superimposed codes (WSCs) (Dai and Milenkovic, 2008)

◮ xi is an integer with |xi| ≤ t.
◮ ‖vi‖p = 1.
◮ Distance requirement:

‖Φ(x1 − x2)‖p ≥ d  ∀ x1 ≠ x2.

A hybrid of CS and Euclidean superimposed codes
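The WSC distance requirement can be checked by brute force on a toy code; the matrix and parameters below are illustrative, and the enumeration is only feasible for tiny N and t:

```python
import itertools
import numpy as np

# Brute-force minimum WSC distance: min over distinct bounded-integer messages
# x1 != x2 (entries in [-t, t]) of ||Phi (x1 - x2)||_p.
def min_wsc_distance(Phi, t, p=2):
    N = Phi.shape[1]
    best = np.inf
    for x1 in itertools.product(range(-t, t + 1), repeat=N):
        for x2 in itertools.product(range(-t, t + 1), repeat=N):
            if x1 != x2:
                diff = np.asarray(x1) - np.asarray(x2)
                best = min(best, np.linalg.norm(Phi @ diff, ord=p))
    return best

Phi = np.array([[1.0, 0.0], [0.0, 1.0]])   # toy code with unit-norm columns
d_min = min_wsc_distance(Phi, t=1)         # smallest achievable distance here is 1.0
```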



SLIDE 11

Rate Bounds for WSCs

Definition: Let N(m, K, d, t) = max {N : ∃ a code C}. The asymptotic code rate is defined as

R(K, d, t) = lim sup_{m→∞} log N(m, K, d, t) / m.

Theorem: For the Euclidean norm,

(log K)/(4K) · (1 + o(1)) ≤ R(K, d, t) ≤ (log K)/(2K) · (1 + o_{t,d}(1)).

For l1-WSCs and nonnegative l1-WSCs,

(log K)/(4K) · (1 + o(1)) ≤ R(K, d, t) ≤ (log K)/K · (1 + o_{t,d}(1)).



SLIDE 13

Interpretation

For WSCs,

(K log N)/(log K) ≤ m ≤ (4K log N)/(log K).

The bounds depend on d only through vanishing terms ⇒ the distance d can be chosen freely without affecting the asymptotic rate.

For classic CS, m ≥ O(K log(N/K)), with no performance guarantee under noise.
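The measurement counts above can be compared numerically; the sizes below are illustrative and the classic-CS formula is used without its unstated constant:

```python
import math

# WSC bounds on m versus the classic CS scaling m ~ K log(N/K) (constants omitted).
def wsc_m_bounds(N, K):
    base = K * math.log(N) / math.log(K)
    return base, 4 * base        # (lower bound, upper bound) on m for WSCs

def classic_cs_m(N, K):
    return K * math.log(N / K)   # classic CS order-of-magnitude requirement

N, K = 10_000, 20
lo, hi = wsc_m_bounds(N, K)      # the classic-CS count falls between the two
```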


SLIDE 14

The Proof of the Upper Bound

Low-hanging fruit: the sphere-packing bound. Minimum distance d ⇒ the balls B(Φx, d/2) are disjoint, so

Σ_{k=1}^{K} (N choose k) (2t)^k ≤ ((tK + d/2)/(d/2))^m  ⇒  (log N)/m ≤ (log K)/K.

High-hanging fruit: a large fraction of the balls lies inside a sphere of smaller radius, which sharpens the bound to

(log N)/m ≤ (log √K)/K = (log K)/(2K).
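The sphere-packing counting inequality can be evaluated directly for small parameters; the numbers below are illustrative:

```python
import math

# Sphere-packing count from the slide: a code with minimum distance d must satisfy
# sum_{k=1}^{K} C(N, k) (2t)^k <= ((tK + d/2) / (d/2))^m.
def packing_lhs(N, K, t):
    # number of nonzero K-sparse messages with integer entries in [-t, t]
    return sum(math.comb(N, k) * (2 * t) ** k for k in range(1, K + 1))

def packing_rhs(m, K, t, d):
    # volume ratio of the enclosing sphere to a ball of radius d/2
    return ((t * K + d / 2) / (d / 2)) ** m

N, K, t, d, m = 50, 3, 2, 1.0, 40
feasible = packing_lhs(N, K, t) <= packing_rhs(m, K, t, d)
```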



SLIDE 16

Proof of the Lower Bounds: Random Coding

Random codes: H ∈ R^{m×N} is a Gaussian random matrix with Hi,j ∼ N(0, 1/m).

Φ: vi = hi/‖hi‖p.

Distance requirement: d ≤ ‖Δy‖p = ‖Φ(x1 − x2)‖p.

(Δy)i ≈ a linear combination of Gaussian random variables; the lp-norm of a Gaussian vector is controlled via large deviations. This yields

R(K, d, t) = lim sup_{(m,N)→∞} (log N)/m ≥ (log K)/(4K) · (1 + o(1)).

Difficulty with nonnegativity: use a Gaussian approximation, with the Berry–Esseen theorem bounding the approximation error.
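The random-code construction on this slide can be sketched directly; the dimensions below are illustrative:

```python
import numpy as np

# Random coding construction: draw H with H_{i,j} ~ N(0, 1/m), then
# normalize each column in the l_p norm to obtain the codewords v_i.
rng = np.random.default_rng(3)
m, N, p = 64, 256, 1

H = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, N))
Phi = H / np.linalg.norm(H, ord=p, axis=0, keepdims=True)  # v_i = h_i / ||h_i||_p
```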



SLIDE 20

Code Construction and Decoding Algorithms

Coding theory:

◮ Offers a myriad of construction techniques.
◮ No efficient decoding methods for WSCs were previously known.

CS:

◮ Offers algorithmic decoding solutions: l1-minimization, OMP, SP, CoSaMP, ...

Combination?


SLIDE 21

Decoding

The WESC decoder: x̂i = round(vi* y).

◮ No iterations (OMP: K iterations).
◮ Discrete input ⇒ complexity reduction: WESC decoder O(mN), OMP O(KmN).
◮ Code rate for both the WESC decoder and OMP: R ≤ 1/(8K²t²) ⇒ m = O(K² log N).
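The one-shot rounding decoder can be illustrated in a toy setting; using orthonormal columns is an assumption made here so that recovery is exact, not a condition stated on the slide:

```python
import numpy as np

# One-shot decoder x_hat_i = round(v_i^* y): correlate the measurement with
# each codeword and round to the nearest integer. With orthonormal columns
# and small noise, bounded-integer x is recovered exactly.
rng = np.random.default_rng(4)
m, N, t = 8, 8, 3

Phi, _ = np.linalg.qr(rng.normal(size=(m, N)))   # toy code: orthonormal columns
x = rng.integers(-t, t + 1, size=N)              # bounded integer message
y = Phi @ x + 0.01 * rng.normal(size=m)          # noisy superposition

x_hat = np.round(Phi.T @ y).astype(int)          # v_i^* y for all i, then round
```

Unlike OMP, no iterations are needed: one matrix-vector product and a rounding step.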


SLIDE 22

Multiuser Interference Cancellation and Decoding

High mobility ⇒ No channel information at transmitters. Coding and decoding motivated by CS.

[Figure: error probability vs. SNR (dB); m = 128, N = 256, K = 16, 1000 realizations; curves: ML decoding + SIC, and subspace-based decoding.]


SLIDE 23

Conclusion

WSCs for constrained CS:

◮ Quantified the code rate.
◮ Deterministic noise tolerance.
◮ Efficient decoding.
