Compressive Parameter Estimation via Approximate Message Passing - PowerPoint PPT Presentation


SLIDE 1

Compressive Parameter Estimation via Approximate Message Passing

Marco F. Duarte
Joint work with Shermin Hamzehei
IEEE ICASSP - April 22, 2015

SLIDE 2

Concise Signal Structure

  • Signal x ∈ ℝᴺ in a basis Ψ: x = Σₙ ψₙθₙ, i.e., x = Ψθ with θ = Ψ⁻¹x.
  • Sparse signal: only K out of N coefficients θ are nonzero; the signal x lies in the set Σ_K of K-sparse signals.
  • Model: union of K-dimensional subspaces aligned with the coordinate axes.

SLIDE 3

From Samples to Measurements

  • Replace samples by a more general encoder based on a few linear projections (inner products): y = Φx, where x is sparse.
  • The number of measurements M scales with the information rate of the sparse signal rather than its ambient dimension N.
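As a minimal sketch of the measurement model above (the dimensions, the matrix name `Phi`, and the Gaussian encoder are illustrative choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 512, 128, 8           # ambient dimension, measurements, sparsity

# K-sparse signal: K random coordinates with Gaussian amplitudes
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# Random Gaussian encoder: M inner products instead of N samples
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                     # M << N linear measurements

print(y.shape)                  # (128,)
```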

SLIDE 4

Parametric Dictionaries for Sparsity

  • Integrates sparsity/CS with parameter estimation.
  • Parametric dictionaries (PDs) collect observations for a set of values of the parameter of interest (one per column).
  • Simple signals (e.g., few localization targets) can be expressed via PDs using sparse coefficient vectors.

[Gorodnitsky and Rao 1997] [Malioutov, Cetin, Willsky 2005] [Cevher, Duarte, Baraniuk 2008] [Cevher, Gurbuz, McClellan, Chellappa 2008] [...]

SLIDE 5

Parameter Estimation in Compressive Sensing

  • "Retrofitting" sparsity in CS for common parameter estimation problems (spectral estimation, localization, bearing estimation) results in several issues: dictionary coherence, basis/discretization mismatch, suboptimal sparsity...
  • Need new signal models that rely on a continuous parameter space and are widely applicable.

[Strohmer and Herman 2012] [Chi, Pezeshki, Scharf, Calderbank 2012] [Liao and Fannjiang 2012] [Duarte and Baraniuk 2013]

SLIDE 6

(Repeat of Slide 5.)

SLIDE 7

Parametric Signals and Basis Mismatch

[Figure: DTFT of a sinusoid with frequency on the DFT grid vs. off the DFT grid]

[Herman and Strohmer 2010] [Chi, Scharf, Pezeshki, and Calderbank 2011]
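The on-grid/off-grid contrast in the figure can be reproduced numerically; this sketch (signal length and frequencies are illustrative) shows that a tone off the DFT grid is not sparse in the DFT basis:

```python
import numpy as np

N = 64
n = np.arange(N)

on_grid  = np.exp(2j * np.pi * 10.0 / N * n)   # frequency on the DFT grid
off_grid = np.exp(2j * np.pi * 10.5 / N * n)   # frequency between grid points

# Normalized DFT magnitudes: 1-sparse on-grid, widespread leakage off-grid
S_on  = np.abs(np.fft.fft(on_grid))  / N
S_off = np.abs(np.fft.fft(off_grid)) / N

print(np.sum(S_on  > 1e-8))   # 1 -> exactly one active DFT bin
print(np.sum(S_off > 1e-2))   # many active bins -> basis mismatch
```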

SLIDE 8

Resolution in Frequency Domain

  • Redundant Fourier frame Ψ with redundancy factor c (e.g., c = 10): frequencies on a grid c times finer than the DFT grid.
  • Increased resolution allows for more scenes to be formulated as sparse in the parametric dictionary.
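A minimal construction of such a frame, assuming the redundancy factor c means a c-times-finer frequency grid (so the frame has cN columns); the off-grid tone from the earlier mismatch example becomes 1-sparse here:

```python
import numpy as np

N, c = 64, 10                        # signal length, redundancy factor
n = np.arange(N)[:, None]
k = np.arange(c * N)[None, :]

# Oversampled Fourier frame: cN unit-norm columns on a finer frequency grid
Psi = np.exp(2j * np.pi * n * k / (c * N)) / np.sqrt(N)
print(Psi.shape)                     # (64, 640)

# A tone off the DFT grid aligns exactly with one frame element
x = np.exp(2j * np.pi * 10.5 / N * n.ravel()) / np.sqrt(N)
corr = np.abs(Psi.conj().T @ x)
print(np.argmax(corr))               # 105 = 10.5 * c
```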

SLIDE 9

Standard Sparse Signal Recovery: Iterative Hard Thresholding

Inputs:

  • Measurement vector y
  • Measurement matrix Φ
  • Sparsity K

Output:

  • PD coefficient estimate θ̂

Initialize: θ̂ = 0, r = y
While halting criterion is false:
  b ← θ̂ + Φᵀ r         (estimate signal)
  θ̂ ← thresh(b, K)     (obtain best K-sparse approx. of b)
  r ← y − Φ θ̂          (calculate residual)
Return estimate θ̂

[Blumensath and Davies 2009]
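The loop above can be sketched in a few lines. This follows the standard IHT iteration with unit step size; the demo dimensions and the helper name `hard_threshold` are illustrative:

```python
import numpy as np

def hard_threshold(b, K):
    """Keep the K largest-magnitude entries of b, zero out the rest."""
    out = np.zeros_like(b)
    idx = np.argsort(np.abs(b))[-K:]
    out[idx] = b[idx]
    return out

def iht(y, Phi, K, n_iters=100):
    """Iterative Hard Thresholding: gradient step + best K-sparse approx."""
    theta = np.zeros(Phi.shape[1])
    for _ in range(n_iters):
        b = theta + Phi.T @ (y - Phi @ theta)   # estimate signal
        theta = hard_threshold(b, K)            # best K-sparse approximation
    return theta

# Tiny demo: recover a 3-sparse vector from 50 Gaussian measurements
rng = np.random.default_rng(1)
N, M, K = 100, 50, 3
theta0 = np.zeros(N)
theta0[rng.choice(N, K, replace=False)] = rng.uniform(1.0, 2.0, K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
theta_hat = iht(Phi @ theta0, Phi, K)
print(np.linalg.norm(theta_hat - theta0))
```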

SLIDE 10

Resolution in Frequency Domain: Coherence

  • Redundant Fourier frame (c = 10): neighboring columns are highly coherent.
  • Sparse signal recovery resembles matched filtering: the inter-column correlation follows the Dirichlet kernel.

[Figure: Dirichlet kernel correlation between frame elements as a function of frequency ω]

SLIDE 11

Structured Frequency-Sparse Signals

  • If x ∈ ℝᴺ is K-structured frequency-sparse, then there exists a K-sparse vector θ such that x = Ψθ and the nonzeros in θ are spaced apart from each other (band exclusion).
  • A K-structured PD-sparse signal f consists of K PD elements that are mutually incoherent (pairwise correlations below a coherence threshold).

[Figure: Dirichlet kernel with exclusion bands around each selected frequency ω]

[Duarte and Baraniuk, 2012] [Fannjiang and Liao, 2012]

SLIDE 12

(Standard IHT repeated for comparison; see Slide 9.)

SLIDE 13

Structured Sparse Signal Recovery: Band-Excluding IHT

Inputs:

  • Measurement vector y
  • Measurement matrix Φ
  • Structured sparse approximation algorithm

Output:

  • PD coefficient estimate θ̂

Initialize: θ̂ = 0, r = y
While halting criterion is false:
  b ← θ̂ + Φᵀ r               (estimate signal)
  θ̂ ← BandExclude(b, K)      (obtain band-excluding sparse approx. of b)
  r ← y − Φ θ̂                (calculate residual)
Return estimate θ̂

Can be applied to a variety of greedy algorithms (CoSaMP, OMP, Subspace Pursuit, etc.)

[Duarte and Baraniuk, 2012] [Fannjiang and Liao, 2012]

SLIDE 14

(Standard IHT repeated for comparison; see Slide 9.)

SLIDE 15

Standard Sparse Signal Recovery: Approximate Message Passing (AMP)

Inputs:

  • Measurement vector y
  • Measurement matrix Φ
  • Sparsity K

Output:

  • PD coefficient estimate θ̂

Initialize: θ̂ = 0, z = y
While halting criterion is false:
  b ← θ̂ + Φᵀ z                            (estimate signal)
  θ̂ ← thresh(b, K)                        (obtain best K-sparse approx. of b)
  z ← y − Φ θ̂ + (z/M) · div thresh(b)     (residual with Onsager correction term)
Return estimate θ̂

The Onsager correction term shapes b to resemble the input signal x in Gaussian noise.

[Donoho, Maleki, and Montanari 2009]
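A compact sketch of the AMP loop above with a hard-thresholding denoiser. It uses the fact that the almost-everywhere divergence of hard thresholding equals the number of surviving coordinates, K; the demo problem sizes are illustrative:

```python
import numpy as np

def amp_hard_threshold(y, Phi, K, n_iters=30):
    """AMP with a hard-thresholding denoiser (sketch). The a.e. divergence
    of hard thresholding is K, giving the Onsager correction (K/M) * z."""
    M, N = Phi.shape
    theta = np.zeros(N)
    z = y.copy()
    for _ in range(n_iters):
        b = theta + Phi.T @ z              # estimate signal (theta + ~AWGN)
        theta = np.zeros(N)
        idx = np.argsort(np.abs(b))[-K:]
        theta[idx] = b[idx]                # best K-sparse approximation
        z = y - Phi @ theta + (K / M) * z  # residual + Onsager correction
    return theta

# Tiny demo: noiseless recovery of a 3-sparse vector
rng = np.random.default_rng(2)
N, M, K = 100, 50, 3
theta0 = np.zeros(N)
theta0[rng.choice(N, K, replace=False)] = rng.uniform(1.0, 2.0, K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
theta_hat = amp_hard_threshold(Phi @ theta0, Phi, K)
print(np.linalg.norm(theta_hat - theta0))
```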

SLIDE 16

The Power of Approximate Message Passing

  • AMP is based on message passing algorithms.
  • The Onsager correction term "shapes" the distribution of the signal estimate b to resemble the original signal embedded in additive white Gaussian noise:

  z ← y − Φ θ̂ + (z/M) · div thresh(b)

  where div thresh(b) is the divergence of hard thresholding (sum of dimension-wise derivatives).
  • Intuition: hard thresholding provides optimal denoising for sparse signals embedded in additive white Gaussian noise.

[Donoho, Maleki, and Montanari 2009] [Metzler, Maleki, and Baraniuk 2014]

SLIDE 17

The Flexibility of AMP

  • The AMP algorithm can be extended to arbitrary signal models by using the optimal denoising algorithm for signals in additive Gaussian noise.
  • Examples: hard thresholding for sparse signals; block thresholding for block-sparse signals; total variation denoisers for piecewise constant signals; image denoising algorithms.
  • "Flexible" AMP requires formulation of a new Onsager correction term specific to the denoiser applied.
  • The divergence can also be estimated numerically via Monte Carlo iterations:

  div η(b) ≈ (1/δ) · εᵀ (η(b + δε) − η(b)), with ε ~ N(0, I);

  average over multiple draws of ε with small δ to obtain a numerical estimate of the expected value.

[Donoho, Johnstone, and Montanari 2013] [Metzler, Maleki, and Baraniuk 2014] [Tan, Ma, and Baron 2015]
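The Monte Carlo divergence estimator above can be sketched as follows; the sanity check uses a linear denoiser η(b) = a·b, whose divergence is exactly a·N, so the estimate is easy to verify (function and argument names are illustrative):

```python
import numpy as np

def mc_divergence(denoiser, b, delta=1e-4, n_draws=1, rng=None):
    """Monte Carlo estimate of div eta(b) = sum_i d eta_i / d b_i:
    div eta(b) ~= (1/delta) * eps^T (eta(b + delta*eps) - eta(b)),
    averaged over draws eps ~ N(0, I)."""
    rng = rng or np.random.default_rng()
    est = 0.0
    for _ in range(n_draws):
        eps = rng.standard_normal(b.shape)
        est += eps @ (denoiser(b + delta * eps) - denoiser(b)) / delta
    return est / n_draws

# Sanity check: for eta(b) = 0.5 * b, the true divergence is 0.5 * len(b)
b = np.random.default_rng(2).standard_normal(200)
div_hat = mc_divergence(lambda u: 0.5 * u, b, n_draws=10,
                        rng=np.random.default_rng(3))
print(div_hat)   # close to 0.5 * 200 = 100
```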

SLIDE 18

AMP with Parametric Denoisers

  • Statistical parameter estimation algorithms can be paired with generative signal models to provide "parametric denoisers".
  • Rich literature in statistical parameter estimation for a multitude of problems (including line spectral estimation).
  • Estimate the Onsager correction term numerically: div η(b) ≈ (1/δ) · εᵀ (η(b + δε) − η(b)); average over multiple realizations of ε with a small value of δ to obtain a numerical estimate of the expected value.
  • In practice, 1-2 iterations often suffice.

[Metzler, Maleki, and Baraniuk 2014]

SLIDE 19

Parametric Denoising via Line Spectral Estimation

Inputs:

  • Noisy observation x
  • Target sparsity K

Line Spectral Estimation-Based Denoising Algorithm (e.g., MUSIC, Root MUSIC, ESPRIT, PHD, ...)

Output:

  • Parameter estimates
  • Denoised signal
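A toy parametric denoiser in the spirit of the slide, using oversampled periodogram peak picking plus a least-squares amplitude fit as a simplified stand-in for MUSIC/ESPRIT-style line spectral estimation (the function name `lse_denoise`, the exclusion window, and the demo signal are all illustrative assumptions):

```python
import numpy as np

def lse_denoise(x, K, c=10):
    """Toy 'parametric denoiser': pick K peaks from a c-times oversampled
    periodogram (excluding the main lobe around each peak), then fit
    amplitudes by least squares. Stand-in for MUSIC/ESPRIT-style LSE."""
    N = len(x)
    grid = np.fft.fftfreq(c * N)                 # finer frequency grid
    spec = np.abs(np.fft.fft(x, c * N))          # zero-padded periodogram
    freqs, s = [], spec.copy()
    for _ in range(K):
        k = int(np.argmax(s))
        freqs.append(grid[k])
        s[max(k - c, 0):min(k + c + 1, len(s))] = 0   # exclude main lobe
    n = np.arange(N)
    A = np.exp(2j * np.pi * np.outer(n, freqs))       # atoms at estimates
    amps, *_ = np.linalg.lstsq(A, x, rcond=None)
    return A @ amps, np.sort(np.array(freqs))

# Demo: denoise a noisy mixture of two complex sinusoids
rng = np.random.default_rng(4)
N = 128
n = np.arange(N)
clean = np.exp(2j * np.pi * 0.1 * n) + 0.8 * np.exp(2j * np.pi * 0.3 * n)
noisy = clean + 0.1 * (rng.standard_normal(N) + 1j * rng.standard_normal(N))
denoised, freqs = lse_denoise(noisy, K=2)
print(freqs)    # close to [0.1, 0.3]
```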

SLIDE 20

Prior Use of Denoisers as Sparse Approximation Algorithms: IHT + Line Spectral Estimation

Inputs:

  • Measurement vector y
  • Measurement matrix Φ
  • Structured sparse approximation algorithm

Output:

  • PD coefficient estimate θ̂

Initialize: θ̂ = 0, r = y
While halting criterion is false:
  b ← θ̂ + Φᵀ r         (estimate signal)
  θ̂ ← LSE(b, K)        (obtain LSE parametric sparse approx. of b)
  r ← y − Φ θ̂          (calculate residual)
Return estimate θ̂

[Duarte and Baraniuk, 2012]

SLIDE 21

Denoising via Line Spectral Estimation: AMP + Line Spectral Estimation

Inputs:

  • Measurement vector y
  • Measurement matrix Φ
  • Sparsity K

Output:

  • PD coefficient estimate θ̂

Initialize: θ̂ = 0, z = y
While halting criterion is false:
  b ← θ̂ + Φᵀ z                          (estimate signal)
  θ̂ ← LSE(b, K)                         (obtain frequency estimates)
  z ← y − Φ θ̂ + (z/M) · div̂ LSE(b)      (numerical estimation of Onsager correction term)
Return estimate θ̂

[Hamzehei and Duarte 2015]
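Putting the pieces together, a hedged skeleton of an AMP loop with a pluggable parametric denoiser and a single-draw Monte Carlo Onsager term. The interface `denoiser(b, K)` and the toy hard-thresholding check are illustrative assumptions, not the paper's exact implementation:

```python
import numpy as np

def amp_parametric(y, Phi, K, denoiser, n_iters=25, delta=1e-4, rng=None):
    """AMP loop with a pluggable denoiser and a Monte Carlo Onsager term
    (one random draw per iteration). `denoiser(b, K)` returns a denoised
    estimate of b."""
    rng = rng or np.random.default_rng()
    M, N = Phi.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(n_iters):
        b = x + Phi.T @ z                      # effective observation
        x = denoiser(b, K)                     # parametric denoising step
        eps = rng.standard_normal(N)           # Monte Carlo divergence
        div = eps @ (denoiser(b + delta * eps, K) - x) / delta
        z = y - Phi @ x + (div / M) * z        # Onsager-corrected residual
    return x

# Check the loop with a hard-thresholding "denoiser" on a toy problem
def hard_threshold(b, K):
    out = np.zeros_like(b)
    idx = np.argsort(np.abs(b))[-K:]
    out[idx] = b[idx]
    return out

rng = np.random.default_rng(5)
N, M, K = 100, 60, 3
x0 = np.zeros(N)
x0[rng.choice(N, K, replace=False)] = rng.uniform(1.0, 2.0, K)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
x_hat = amp_parametric(Phi @ x0, Phi, K, hard_threshold,
                       rng=np.random.default_rng(6))
print(np.linalg.norm(x_hat - x0))
```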

SLIDE 22

Phase Transition Diagram for Compressive Line Spectral Estimation

[Figure: phase transition over δ = M/N and ρ = K/M for AMP→MUSIC, ℓ1-min.→MUSIC, BISP, IHT+MUSIC, AMP+MUSIC]

Success: average frequency estimation error < 1 Hz over 100 trials; N = 512.

BISP: Band-Excluding Interpolating Subspace Pursuit [Fyhn, Duarte, Jensen 2014]

SLIDE 23

Phase Transition Diagram for Compressive Line Spectral Estimation

[Figure: average frequency estimation error (Hz) vs. number of measurements for AMP→MUSIC, ℓ1-min.→MUSIC, BISP, IHT+MUSIC, AMP+MUSIC]

N = 512, K = 8, 100 trials.

SLIDE 24

Conclusions

  • Approximate Message Passing is flexible enough to be extended from compressive sensing (signal recovery) to compressive parameter estimation.
  • Leverage existing statistical estimation algorithms as "parametric denoisers" within AMP.
  • Sidestep discretization issues implicit in the use of parametric dictionaries and parameter tuning issues from structured sparsity models.
  • Additional computation in Onsager term estimation.
  • Future work: theoretical analysis (state evolution, denoiser analysis, measurement bounds...).
  • Many other example applications: bearing estimation, time delay estimation, localization...

http://www.ecs.umass.edu/~mduarte
mduarte@ecs.umass.edu