Message Passing Algorithms for Compressed Sensing
DLD, Arian Maleki, Andrea Montanari. Stanford, September 2, 2009.


SLIDE 1

Message Passing Algorithms for Compressed Sensing

DLD, Arian Maleki, Andrea Montanari

Stanford

September 2, 2009


SLIDE 2

Outline

◮ Compressed Sensing
◮ Phase Transitions
◮ Simple Iterative Algorithms
◮ Heuristics
◮ Message Passing Algorithms

SLIDE 3

Compressed Sensing – the heuristic

◮ Real images and signals are compressible
◮ Equivalently: few large coefficients, e.g. in a wavelet basis
◮ Fewer than nominal degrees of freedom:
  not 10^6 pixels, just 10^4 wavelet coefficients, plus the positions of those coefficients
◮ Standard sampling: 10^6 measurements
◮ “Morally,” c · 10^4 measurements should suffice

SLIDE 4

Compressed Sensing MRI

◮ “Sparse MRI – The Application of Compressed Sensing in Magnetic Resonance Imaging” – Michael Lustig, DLD, John Pauly. Magnetic Resonance in Medicine, 2007.
◮ “Compressed Sensing MRI” – Michael Lustig, John Pauly, Juan Santos, DLD. IEEE Signal Processing Magazine, Special Issue on CS, March 2008.
◮ Inspired by CS theory:
  ◮ Rapid Contrast-Enhanced 3D Angiography,
  ◮ Whole-Heart Coronary Imaging,
  ◮ Brain Imaging, and
  ◮ Dynamic Heart Imaging.

SLIDE 5

A Theoretical Formalization of Compressed Sensing

Ingredients

◮ Sparsity.
  An object x0 ∈ R^N with k ≪ N nonzero coefficients in a fixed basis.

◮ Random Undersampled Measurements.
  Measure y = A x0 with a random n × N matrix A (e.g. iid Gaussian), n < N.

◮ Nonlinear Reconstruction. Attempt reconstruction with x1 solving

  (P1)  min ‖x‖₁ subject to y = Ax.    AKA: Minimum ℓ1, Basis Pursuit.

◮ Computationally feasible; compare the NP-hard

  (P0)  min ‖x‖₀ subject to y = Ax.

◮ Surprise: often in the n < N underdetermined case, (P0) and (P1) have the same, unique solution.

◮ Voluminous literature, IEEE 2001–today.
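The (P1) problem becomes a standard linear program once x is split into positive and negative parts. A minimal sketch, assuming NumPy and SciPy are available (illustrative only, not the solvers used in the talk):

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve (P1): min ||x||_1 s.t. y = A x, via the standard LP
    reformulation x = u - v with u, v >= 0 and objective 1'u + 1'v."""
    n, N = A.shape
    res = linprog(c=np.ones(2 * N),
                  A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=(0, None))
    return res.x[:N] - res.x[N:]

# k-sparse x0, random Gaussian A with n < N: (P1) still recovers x0.
rng = np.random.default_rng(0)
n, N, k = 40, 100, 5
A = rng.normal(size=(n, N)) / np.sqrt(n)   # approximately unit-length columns
x0 = np.zeros(N)
x0[:k] = rng.normal(size=k)
x1 = basis_pursuit(A, A @ x0)
```

Here δ = n/N = 0.4 and ρ = k/n = 0.125 sit well inside the success phase, so (P1) returns x0 up to solver tolerance.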


SLIDE 6

Phase Transition for ±

Theorem. There is a function ρCG(δ; ±) with the following characteristics. Let δ = n/N, ρ = k/n, and fix ǫ > 0.
  If ρ < ρCG(δ; ±)(1 − ǫ), then with overwhelming probability for large n, N, x1 = x0.
  If ρ > ρCG(δ; ±)(1 + ǫ), then with overwhelming probability for large n, N, x1 ≠ x0.

DLD (Discr. & Comput. Geom., 2006); DLD and Jared Tanner (JAMS, 2009): fully rigorous, explicit calculation of ρCG using special functions. Methods: combinatorial geometry [Affentranger and Schneider (1992), Vershik and Sporyshev (1992)].

SLIDE 7

Empirical Results: ±

Random matrix A, δ = n/N, ρ = k/n, N = 400.

[Figure: observed fraction of successful ℓ1 reconstructions in the (δ, ρ) plane; axes δ = n/N (horizontal, 0.1–1) and ρ = k/n (vertical, 0.1–1); level curves at 0.10, 0.50, 0.90.]

SLIDE 8

Nonnegative coefficients

◮ Underdetermined system of equations:

  y = Ax,  x ≥ 0.

◮ Sparsest solution:

  (NP+)  x0 = argmin ‖x‖₀  s.t.  y = Ax, x ≥ 0.

◮ Problem: NP-hard in general.
◮ Relaxation:

  (LP+)  x1 = argmin 1′x  s.t.  y = Ax, x ≥ 0.

  A standard linear program.
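(LP+) is already in standard form, so a generic LP solver applies directly. A minimal sketch, again assuming SciPy's linprog (an illustration, not from the talk):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
n, N, k = 50, 100, 10
A = rng.normal(size=(n, N)) / np.sqrt(n)
x0 = np.zeros(N)
x0[:k] = rng.uniform(1.0, 2.0, size=k)      # k-sparse, nonnegative object
y = A @ x0

# (LP+): minimize 1'x subject to y = A x, x >= 0.
res = linprog(c=np.ones(N), A_eq=A, b_eq=y, bounds=(0, None))
x1 = res.x
```

With δ = 0.5 and ρ = 0.2, below the + phase boundary, the LP recovers x0 exactly up to solver tolerance.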


SLIDE 9

Phase Transition for +

Theorem. There is a function ρCG(·; +) : (0, 1] → (0, 1] with the following characteristics. Let δ = n/N, ρ = k/n, and fix ǫ > 0.
  If ρ < ρCG(δ; +)(1 − ǫ), then with overwhelming probability for large n, N, x1 = x0.
  If ρ > ρCG(δ; +)(1 + ǫ), then with overwhelming probability for large n, N, x1 ≠ x0.

DLD and Tanner (PNAS, 2005[a,b]): fully rigorous, explicit calculation of ρCG using special functions. Methods: combinatorial geometry [Affentranger and Schneider (1992), Vershik and Sporyshev (1992)].

SLIDE 10

Empirical Results: +

Random matrix A, δ = n/N, ρ = k/n, N = 400.

[Figure: observed fraction of successful (LP+) reconstructions in the (δ, ρ) plane; axes δ = n/N (horizontal, 0.1–1) and ρ = k/n (vertical, 0.1–1); level curves at 0.10, 0.50, 0.90.]

SLIDE 11

Object with k-Simple, Bounded Coefficients

◮ Underdetermined system of equations:

  y = Ax,  −1 ≤ x(i) ≤ 1,  1 ≤ i ≤ N.

◮ Simplicity: #{i : |x(i)| = 1} small.
◮ Simplest solution:

  (NP□)  x0 = argmin #{i : |x(i)| = 1}  s.t.  y = Ax,  x(i) ∈ [−1, 1].

  Problem: NP-hard in general.
◮ Relaxation:

  (LP□)  x1 = argmin 1′x  s.t.  y = Ax,  x(i) ∈ [−1, 1].

  A standard linear program (in effect, a feasibility problem).

SLIDE 12

Phase Transition for □

Theorem. Set ρCG(δ; □) = (2 − δ⁻¹)₊. Let δ = n/N, ρ = k/n, and fix ǫ > 0.
  If ρ < ρCG(δ; □)(1 − ǫ), then with overwhelming probability for large n, N, x1 = x0.
  If ρ > ρCG(δ; □)(1 + ǫ), then with overwhelming probability for large n, N, x1 ≠ x0.

DLD and Jared Tanner (2008, in press, Discr. & Comput. Geom.): exact finite-n identities in geometric probability. Methods: Wendel’s Theorem, Winder’s Theorem, Oriented Matroids.
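The box-case boundary is the one transition with a closed form, so it can be evaluated directly; a small worked example of the formula:

```python
def rho_cg_box(delta):
    """rho_CG(delta; box) = (2 - 1/delta)_+ : the phase boundary for
    k-simple, bounded objects; identically zero for delta <= 1/2."""
    return max(2.0 - 1.0 / delta, 0.0)

# Boundary values at a few undersampling ratios delta = n/N.
print([rho_cg_box(d) for d in (0.25, 0.5, 0.75, 1.0)])
```

Note the boundary vanishes for δ ≤ 1/2, matching the empirical plot on the next slide, whose δ-axis starts at 0.5.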


SLIDE 13

Empirical Results: □

Random matrix A, δ = n/N, ρ = k/n, N = 400.

[Figure: observed fraction of successful (LP□) reconstructions in the (δ, ρ) plane; axes δ = n/N (horizontal, 0.5–1) and ρ = k/n (vertical, 0.1–1); level curves at 0.10, 0.50, 0.90.]

SLIDE 14

The Three Theoretical PT’s

[Figure: the three theoretical phase-transition curves in the (δ, ρ) plane; axes δ = n/N (horizontal) and ρ = k/n (vertical), both 0.1–1.]

Green: +, Red: ±, Blue: □.

SLIDE 15

Fast Iterative Algorithms

◮ Problem: interesting problem sizes for the MRI application:
  N ≈ 10^6, n ≈ 10^4.
◮ Generic LP: at least n² · N complexity.
◮ Alternative: iterative algorithms involving C applications of A and A∗, 10 < C < 50, to appropriate vectors.
◮ Complexity: C n N in general, e.g. A iid Gaussian;
◮ Complexity: C N log(N) for FFT-based matrices.
◮ Heavily used for large-scale applications: Jean-Luc Starck (2003–), Miki Elad (2004–), ....

SLIDE 16

Simple Iterative Algorithms

Assume the columns of A have unit length.

  x_{t+1} = η_t(A∗ z_t + x_t),    (1)
  z_t = y − A x_t.                (2)

◮ Thresholding: η_t(·) = η(·; λσ_t, χ).
◮ λ is a threshold control parameter;
◮ χ ∈ {+, ±, □} controls the shape of the nonlinearity;
◮ σ_t² = Ave_j E{(x_t(j) − x0(j))²} is the MSE.
◮ “Iterative Soft Thresholding” (IST).
◮ Only applies A and A∗; no need for the matrix A itself.
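A minimal NumPy sketch of iterations (1)-(2) with the ± nonlinearity. The true MSE σ_t is not observable, so this sketch substitutes the residual norm ‖z_t‖/√n as a proxy, an assumption not specified on the slide:

```python
import numpy as np

def soft_threshold(u, t):
    """eta(u; t, +/-): shrink each entry toward zero by t."""
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def ist(A, y, lam=2.0, iters=300):
    """Iterative Soft Thresholding:  x <- eta(A* z + x),  z = y - A x."""
    n, N = A.shape
    x = np.zeros(N)
    for _ in range(iters):
        z = y - A @ x
        sigma_hat = np.linalg.norm(z) / np.sqrt(n)   # proxy for sigma_t
        x = soft_threshold(x + A.T @ z, lam * sigma_hat)
    return x

rng = np.random.default_rng(2)
n, N, k = 100, 200, 5
A = rng.normal(size=(n, N)) / np.sqrt(n)   # approximately unit-length columns
x0 = np.zeros(N)
x0[:k] = 1.0
x_hat = ist(A, A @ x0)
```

At this sparsity level (ρ = 0.05, well below IST's transition) the iteration contracts to x0; closer to the transition it stalls, which is the failure mode discussed later.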


SLIDE 17

Thresholding Functions – scalar y, t > 0:

◮ Case +:
  η_t(y; +) = y − t  if y > t;   0  if y ≤ t.

◮ Case ±:
  η_t(y; ±) = y − t  if y > t;   0  if |y| ≤ t;   y + t  if y < −t.

◮ Case □:
  η_t(y; □) = t  if y > t;   y  if |y| ≤ t;   −t  if y < −t.

◮ Connection to optimization:
  η_t(y; ±) = argmin_x (y − x)²/2 + t|x|.

◮ ‘Shrinkage towards 0’; ‘Soft Thresholding’; used in ‘Wavelet Shrinkage’.
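The three cases above in NumPy form (a direct transcription of the case definitions; '+-' and 'box' are ASCII stand-ins for the ± and box-constrained cases):

```python
import numpy as np

def eta(y, t, chi):
    """Thresholding eta_t(y; chi) for t > 0 and chi in {'+', '+-', 'box'}."""
    y = np.asarray(y, dtype=float)
    if chi == '+':       # one-sided: keep only the part above t
        return np.maximum(y - t, 0.0)
    if chi == '+-':      # soft threshold: shrink toward 0 by t
        return np.sign(y) * np.maximum(np.abs(y) - t, 0.0)
    if chi == 'box':     # pass small values, saturate at +/- t
        return np.clip(y, -t, t)
    raise ValueError(chi)

print(eta([2.0, 0.5, -2.0], 1.0, '+-'))   # soft threshold at t = 1
```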


SLIDE 18

Can Iterative Thresholding Algorithms Work?

◮ Easy: A orthogonal: solves in one iteration.
◮ Hard: A square, nonsingular: with care, asymptotically solves the LP (Daubechies, De Mol, Bertero).
◮ Weird: A with n < N, the compressed sensing case.

SLIDE 19

Mutual Access Interference (MAI) Heuristic

◮ Assume the columns of A are normalized to unit length.
◮ Set H = A∗A − I; call H x0 the MAI (aka cross-channel interference).
◮ A∗y = x0 + H x0 = x0 + MAI.
◮ For a ‘typical’ vector v and ‘random’ matrix A, Hv has a marginal distribution that is approximately N(0, ‖v‖₂²/n).
◮ So A∗y = x0 + Noise.
◮ First iteration: x1 = η_t(A∗y) = η_t(x0 + Noise).
◮ Shrinkage heuristic:
  ◮ Shrinkage kills ‘small elements’; keeps ‘large elements’.
  ◮ Small elements are mostly noise; large elements are mostly signal.
  ◮ ‖x1 − x0‖₂² ≪ ‖0 − x0‖₂²: off to the races!

SLIDE 20

Sketch of MAI/Thresholding Heuristics

[Figure: sketch of the MAI/thresholding heuristic, from Lustig, DLD, Pauly, IEEE Signal Processing Magazine, 2008.]

SLIDE 21

State Evolution (SE) Definition

MSE Map:

  Ψ(σ²) ≡ E{ [ η( X + (σ/√δ) Z ; λσ ) − X ]² },  Z ∼ N(0, 1) independent of X.    (3)

State Evolution:

◮ Implicit parameters (χ, δ, ρ, λ, F), F = F_X the marginal distribution of the rv X.
◮ Explicit parameter σ².
◮ State evolution step:
  σ_t² → Ψ(σ_t²) ≡ σ_{t+1}²    (implicit parameters (χ, δ, ρ, λ, F) fixed).
◮ Full state evolves as
  (σ_t²; χ, δ, ρ, λ, F_X) → (Ψ(σ_t²); χ, δ, ρ, λ, F_X).
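The MSE map can be estimated by Monte Carlo. A sketch for χ = ±, with an assumed three-point marginal F (mass 1 − ρδ at zero, the remainder split between ±1); this specific F is only an illustration, chosen for convenience:

```python
import numpy as np

def soft_threshold(u, t):
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def mse_map(sigma2, delta, rho, lam, n_mc=200_000, seed=0):
    """Monte Carlo estimate of Psi(sigma^2) for chi = +/-:
    E[(eta(X + (sigma/sqrt(delta)) Z; lam*sigma) - X)^2],
    with X ~ F: P(X=0) = 1 - rho*delta, P(X=+/-1) = rho*delta/2 (assumed F)."""
    rng = np.random.default_rng(seed)
    eps = rho * delta                      # sparsity fraction k/N
    X = rng.choice([-1.0, 0.0, 1.0], size=n_mc, p=[eps / 2, 1 - eps, eps / 2])
    Z = rng.standard_normal(n_mc)
    sigma = np.sqrt(sigma2)
    est = soft_threshold(X + (sigma / np.sqrt(delta)) * Z, lam * sigma)
    return float(np.mean((est - X) ** 2))

# Iterate state evolution from sigma^2 = E X^2; in Region (I) it contracts to 0.
delta, rho, lam = 0.5, 0.10, 2.0
s2 = rho * delta                           # E X^2 for this F
for t in range(30):
    s2 = mse_map(s2, delta, rho, lam, seed=t)
```

At (δ, ρ) = (0.5, 0.10) the map contracts by roughly a constant factor per step, so the iterated state collapses toward σ² = 0.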


SLIDE 22

State Evolution (SE) Asymptotics

Two regions of parameter space:

Region (I): Ψ(σ²) < σ² for all σ² ∈ (0, E X²]. Here σ_t² → 0 as t → ∞: SE evolves to σ² = 0.

Region (II): the complement of Region (I). SE does not evolve to σ² = 0.

State Evolution Phase Boundary:

  ρSE(δ; χ, λ, F_X) ≡ sup { ρ : (δ, ρ, λ, F_X) ∈ Region (I) }.    (4)

SLIDE 23

Independence from Object Distribution

Theorem. For every distribution F = F_X with no atom at zero and finite second moment E X² < ∞,

  ρSE(δ; χ, λ, F) = ρSE(δ; χ, λ).

Conclude:

◮ δ and ρ are ‘what matter’ to SE.
◮ The graph of ρSE is the boundary between phases of the (δ, ρ) phase diagram.

SLIDE 24

Computational Geometry-State Evolution Agreement

Finding

For the three canonical problems χ ∈ {+, ±, □}:

  ρSE(δ; χ) = ρCG(δ; χ)   ∀ δ ∈ (0, 1).    (5)

◮ They agree numerically to high accuracy.
◮ Equality rigorously proven for χ = □.
◮ Asymptotic equality rigorously proven for χ ∈ {+, ±} in the limit δ → 0.

SLIDE 25

Large Scale Testing of Iterative Algorithms

Maleki and DLD (2009, submitted, IEEE Select. Topics in SP)

◮ Algorithms:

IST, IHT, CoSaMP, Subspace Pursuit, ...

◮ Matrix Ensembles A:

Gaussian, Bernoulli, partial Fourier, partial Hadamard, ...

◮ Object Distributions F:

Constant, Double Exponential, Uniform, Power Law, ...

◮ Problem sizes: N = 800 and 2^ℓ, ℓ = 10, 12, 14.
◮ Goal: measure phase transitions; tune algorithms to optimize transitions.
◮ 3.8 CPU-years of computation.

SLIDE 26

Empirical Results

[Figure: empirical phase transitions for the tested algorithms.]

SLIDE 27

Message Passing Iterative Thresholding

Message Passing has been successful in many other problems (examples: Gallager LDPC codes; Yedidia; Wainwright; Willsky; ...). Adapt it to iterative thresholding.

Thresholding reconstructions (messages):

  x^{t+1}_{i→a} = η_t( Σ_{b∈[n]\a} A_{bi} z^t_{b→i} ),    (6)

Residuals (messages):

  z^t_{a→i} = y_a − Σ_{j∈[N]\i} A_{aj} x^t_{j→a},    (7)

for each (i, a) ∈ [N] × [n].

Problem: N iterative algorithms in semi-parallel means N times as much work as the simple iterative algorithms.

SLIDE 28

Approximate Message Passing (AMP)

First-order Approximate Message Passing (AMP) algorithm:

  x_{t+1} = η_t( A∗ z_t + x_t ),    (8)
  z_t = y − A x_t + (1/δ) z_{t−1} ⟨ η′_{t−1}( A∗ z_{t−1} + x_{t−1} ) ⟩,    (9)

where η′_t(s) = (∂/∂s) η_t(s) and ⟨·⟩ denotes the average over components.

Feature: essentially the same cost as Iterative Soft Thresholding.
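A minimal NumPy sketch of (8)-(9) for χ = ±, using the residual norm as an assumed proxy for σ_t; for soft thresholding, the average ⟨η′⟩ is simply the fraction of coordinates passing the threshold:

```python
import numpy as np

def soft_threshold(u, t):
    return np.sign(u) * np.maximum(np.abs(u) - t, 0.0)

def amp(A, y, lam=2.0, iters=50):
    """Approximate Message Passing, eqs. (8)-(9): IST plus the Onsager
    correction term (1/delta) * z_{t-1} * <eta'_{t-1}>."""
    n, N = A.shape
    delta = n / N
    x = np.zeros(N)
    z = y.copy()
    for _ in range(iters):
        sigma_hat = np.linalg.norm(z) / np.sqrt(n)     # proxy for sigma_t
        pseudo = x + A.T @ z                           # A* z_t + x_t
        x = soft_threshold(pseudo, lam * sigma_hat)
        # eta' is 1 on coordinates that survive the threshold, 0 elsewhere;
        # its average over j gives the Onsager reaction term.
        onsager = np.mean(np.abs(pseudo) > lam * sigma_hat) / delta
        z = y - A @ x + onsager * z
    return x

rng = np.random.default_rng(3)
n, N, k = 100, 200, 10
A = rng.normal(size=(n, N)) / np.sqrt(n)
x0 = np.zeros(N)
x0[:k] = 1.0
x_hat = amp(A, A @ x0)
```

The extra Onsager term is what keeps the effective noise at the input of the threshold Gaussian across iterations, which is why state evolution tracks AMP but not plain iterative thresholding.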


SLIDE 29

SE is Correct for AMP

Finding

For the AMP algorithm, and large dimensions N, n, we observe:

I. SE correctly predicts the evolution of numerous statistical properties of x_t with t. The MSE, the number of nonzeros in x_t, the number of false alarms, the number of missed detections, and several other measures all evolve in a way consistent with the state evolution formalism, to within experimental accuracy.

II. SE correctly predicts success or failure to converge to the correct result. In particular, SE predicts no convergence when ρ > ρSE(δ; χ, λ), and convergence if ρ < ρSE(δ; χ, λ). This is indeed observed empirically.

SLIDE 30

Finding 1.a MAI Heuristic is Correct for AMP

[Figure: nine QQ-plots of threshold-input sample quantiles against standard normal quantiles, at iterations 10, 20, ..., 90; the Gaussian shape persists across iterations, as the MAI heuristic predicts.]

SLIDE 31

Finding 1.b State Evolution is Correct for AMP Observables, 1

[Figure: four panels vs. iteration (10–40), each comparing empirical and theoretical (SE) curves: MSE on the nonzero elements, MSE on the zero elements, Missed Detection Rate, and False Alarm Rate.]

SLIDE 32

Finding 1.b State Evolution is Correct for AMP Observables, 2

[Figure: additional observables, empirical vs. SE prediction.]

SLIDE 33

Finding 2. SE Transition Agrees with AMP Transition, 1

[Figure: comparison of the phase transitions of different algorithms (IST, ℓ1/CG, AMP) in the (δ, ρ) plane; axes δ = n/N (horizontal) and ρ = k/n (vertical).]

Random matrix A, δ = n/N, ρ = k/n. N = 1000. Blue: CG; Red: AMP; Green: IST.

SLIDE 34

Finding 2. SE Transition Agrees with AMP Transition, 2

[Figure: SE transitions vs. observed AMP transitions in the (δ, ρ) plane; axes δ = n/N (horizontal) and ρ = k/n (vertical).]

Random matrix A, δ = n/N, ρ = k/n. Dashed: empirical; solid: asymptotic. N = 1000. Red: +; Blue: ±; Green: □.

SLIDE 35

Universality

◮ Same experiment, different coefficient distributions:
  ◮ Constant-amplitude nonzeros
  ◮ Power law
  ◮ Gaussian
◮ Same experiment, different matrix ensembles:
  ◮ Bernoulli: fair coin tossing, ±1.
  ◮ Partial Fourier: sample n rows from the N × N Fourier matrix.
  ◮ Partial Hadamard: sample n rows from the N × N Hadamard matrix.
  ◮ Gaussian: A_{i,j} ∼ N(0, 1).
  ◮ Uniform random projection: A uniform (Haar measure) over orthoprojectors.

Empirically: the same behavior. Matches findings for LP optimization in Tanner and DLD (2009, in press, Phil. Trans. Roy. Soc. A).

SLIDE 36

Conclusions

◮ Phase transitions for LP: rigorous foundation in combinatorial geometry.
◮ Standard iterative algorithms are heuristically like LP, but do not achieve the CG phase transitions.
◮ The AMP algorithm, a slight modification of the simple iterative algorithms, does achieve the CG phase transitions.
◮ State Evolution explains the properties of AMP.
◮ SE generates new formulas for phase transitions and for optimal thresholds in algorithms.