Compressive Parameter Estimation via Approximate Message Passing




  1. Compressive Parameter Estimation via Approximate Message Passing
     Marco F. Duarte. Joint work with Shermin Hamzehei.
     IEEE ICASSP, April 22, 2015.

  2. Concise Signal Structure
     • Sparse signal: only K out of N coefficients θ_n are nonzero
       - model: union of K-dimensional subspaces aligned with the coordinate axes
     • x = Ψθ (equivalently θ = Ψ⁻¹x), where x ∈ R^N is the sparse signal and θ ∈ Σ_K has K nonzero entries

  3. From Samples to Measurements
     • Replace samples by a more general encoder based on a few linear projections (inner products): y = Φx, with x sparse
     [Figure: sparse signal x mapped to measurement vector y; the number of measurements scales with the information rate]
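A minimal numerical sketch of the signal and measurement model from the two slides above, assuming the canonical sparsity basis (Ψ = I) and an i.i.d. Gaussian measurement matrix; the dimensions N, M, K are illustrative choices, not values from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 64, 5               # ambient dimension, measurements, sparsity

# K-sparse coefficient vector theta; with Psi = identity, x = theta
theta = np.zeros(N)
support = rng.choice(N, K, replace=False)
theta[support] = rng.standard_normal(K)
x = theta

# Compressive measurements: a few linear projections (inner products) of x
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                         # M << N measurements
```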

  4. Parametric Dictionaries for Sparsity
     • Integrates sparsity/CS with parameter estimation
     • Parametric dictionaries (PDs) collect observations for a set of values of the parameter of interest (one per column)
     • Simple signals (e.g., few localization targets) can be expressed via PDs using sparse coefficient vectors
       [Gorodnitsky and Rao 1997] [Malioutov, Cetin, Willsky 2005] [Cevher, Duarte, Baraniuk 2008] [Cevher, Gurbuz, McClellan, Chellappa 2008] [...]
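As an illustration of a parametric dictionary, the sketch below builds a redundant Fourier (line-spectral) dictionary with one unit-norm column per frequency on a grid whose density is set by a redundancy factor c; the grid construction and normalization are assumptions made for the example, not the specific dictionary used in the talk.

```python
import numpy as np

def fourier_pd(N, c=10):
    """Parametric dictionary for frequency estimation: one unit-norm column
    exp(2j*pi*omega*n)/sqrt(N) per frequency omega on a grid of c*N points."""
    omegas = np.arange(c * N) / (c * N)          # frequency grid in [0, 1)
    n = np.arange(N)[:, None]
    Psi = np.exp(2j * np.pi * n * omegas[None, :]) / np.sqrt(N)
    return Psi, omegas

Psi, omegas = fourier_pd(N=64, c=10)             # N x (c*N) redundant frame
```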

  5. Parameter Estimation in Compressive Sensing
     • "Retrofitting" sparsity in CS for common parameter estimation problems (spectral estimation, localization, bearing estimation) results in several issues: dictionary coherence, basis/discretization mismatch, suboptimal sparsity…
     • Need new signal models that rely on a continuous parameter space and are widely applicable
       [Strohmer and Herman 2012] [Chi, Pezeshki, Scharf, Calderbank 2012] [Liao and Fannjiang 2012] [Duarte and Baraniuk 2013]

  6. Parameter Estimation in Compressive Sensing (repeat of slide 5)

  7. Parametric Signals and Basis Mismatch
     [Figure: DTFT of a parametric signal on the DFT grid vs. off the DFT grid]
     [Herman and Strohmer 2010] [Chi, Scharf, Pezeshki, and Calderbank 2011]

  8. Resolution in Frequency Domain
     • Redundant Fourier frame, redundancy factor c = 10
     • Increased resolution allows more scenes to be formulated as sparse in the parametric dictionary

  9. Standard Sparse Signal Recovery: Iterative Hard Thresholding
     Inputs: measurement vector y, measurement matrix, sparsity K
     Output: PD coefficient estimate
     • Initialize
     • While halting criterion false:
       - (estimate signal)
       - (obtain best sparse approx.)
       - (calculate residual)
     • Return estimate
     [Blumensath and Davies 2009]
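A minimal sketch of the IHT iteration outlined above; the unit step size, fixed iteration count, and hard-thresholding rule are simplifications for illustration.

```python
import numpy as np

def hard_threshold(b, K):
    """Keep the K largest-magnitude entries of b, zero out the rest."""
    xhat = np.zeros_like(b)
    idx = np.argsort(np.abs(b))[-K:]
    xhat[idx] = b[idx]
    return xhat

def iht(y, Phi, K, n_iters=100):
    xhat = np.zeros(Phi.shape[1], dtype=Phi.dtype)
    r = y.copy()                           # initialize residual
    for _ in range(n_iters):               # while halting criterion false
        b = xhat + Phi.conj().T @ r        # estimate signal
        xhat = hard_threshold(b, K)        # obtain best K-sparse approximation
        r = y - Phi @ xhat                 # calculate residual
    return xhat                            # return estimate
```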

  10. Resolution in Frequency Domain
     • Redundant Fourier frame, c = 10
     • Sparse signal recovery resembles matched filtering: the coherence between dictionary elements follows a Dirichlet kernel
     [Figure: coherence vs. frequency offset ω]

  11. Structured Frequency-Sparse Signals
     • A K-structured PD-sparse signal f ∈ R^N consists of K PD elements that are mutually incoherent: their pairwise coherence is small whenever their frequencies are sufficiently separated.
     • If x is K-structured frequency-sparse, then there exists a K-sparse vector θ such that x = Ψθ and the nonzeros in θ are spaced apart from each other (band exclusion).
     [Figure: Dirichlet-kernel coherence vs. frequency offset ω]
     [Duarte and Baraniuk, 2012] [Fannjiang and Liao, 2012]

  12. Standard Sparse Signal Recovery: Iterative Hard Thresholding (repeat of slide 9, for comparison) [Blumensath and Davies 2009]

  13. Structured Sparse Signal Recovery: Band-Excluding IHT
     Inputs: measurement vector y, measurement matrix, structured sparse approx. algorithm
     Output: PD coefficient estimate
     • Initialize
     • While halting criterion false:
       - (estimate signal)
       - (obtain band-excluding sparse approx.)
       - (calculate residual)
     • Return estimate
     Can be applied to a variety of greedy algorithms (CoSaMP, OMP, Subspace Pursuit, etc.)
     [Duarte and Baraniuk, 2012] [Fannjiang and Liao, 2012]
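A sketch of a band-excluding K-sparse approximation that could replace the hard-thresholding step in the IHT loop shown earlier; the exclusion width (in dictionary-grid bins) is an assumed parameter, not a value from the talk.

```python
import numpy as np

def band_excluding_approx(b, K, band):
    """Greedily pick K entries of b in decreasing magnitude, skipping any
    index within `band` grid bins of an already-selected index."""
    order = np.argsort(np.abs(b))[::-1]
    selected = []
    for idx in order:
        if all(abs(idx - j) > band for j in selected):
            selected.append(idx)
        if len(selected) == K:
            break
    xhat = np.zeros_like(b)
    xhat[selected] = b[selected]
    return xhat
```

Band-Excluding IHT then runs the same loop as standard IHT with hard_threshold replaced by band_excluding_approx.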

  14. Standard Sparse Signal Recovery: Iterative Hard Thresholding (repeat of slide 9, for comparison with AMP) [Blumensath and Davies 2009]

  15. Standard Sparse Signal Recovery: Approximate Message Passing (AMP)
     Inputs: measurement vector y, measurement matrix, sparsity K
     Output: PD coefficient estimate
     • Initialize
     • While halting criterion false:
       - (estimate signal)
       - (obtain best sparse approx.)
       - (Onsager correction term)
     • Return estimate
     The Onsager correction term shapes b to resemble the input signal x in Gaussian noise.
     [Donoho, Maleki, and Montanari 2009]
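A sketch of the AMP iteration with a hard-thresholding denoiser, reusing hard_threshold from the IHT sketch above; the Onsager correction uses the divergence of hard thresholding (the number of surviving entries, see slide 16), and the scalings and iteration count are simplified assumptions.

```python
import numpy as np

def amp_hard(y, Phi, K, n_iters=50):
    M, N = Phi.shape
    xhat = np.zeros(N)
    r = y.copy()
    for _ in range(n_iters):
        b = xhat + Phi.T @ r                 # estimate signal (pseudo-data)
        xhat = hard_threshold(b, K)          # obtain best K-sparse approximation
        div = K                              # divergence of hard thresholding:
                                             # number of entries that survive
        r = y - Phi @ xhat + (div / M) * r   # residual + Onsager correction term
    return xhat
```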

  16. The Power of Approximate Message Passing
     • AMP is based on message passing algorithms
     • The Onsager correction term "shapes" the distribution of the signal estimate b to resemble the original signal embedded in additive white Gaussian noise [Donoho, Maleki, and Montanari 2009]
     • The correction involves the divergence of hard thresholding (sum of dimension-wise derivatives)
     • Intuition: hard thresholding provides optimal denoising for sparse signals embedded in additive white Gaussian noise [Metzler, Maleki, and Baraniuk 2014]
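Written out (a reconstruction in the spirit of the slide, since the original formula was part of the graphic), the divergence of the hard-thresholding denoiser η_K that keeps the K largest-magnitude entries is

```latex
\operatorname{div}\,\eta_K(b)
  \;=\; \sum_{n=1}^{N} \frac{\partial\,[\eta_K(b)]_n}{\partial b_n}
  \;=\; \#\{\, n : [\eta_K(b)]_n \neq 0 \,\}
  \;=\; K \quad \text{(almost everywhere)} .
```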

  17. The Flexibility of AMP
     • The AMP algorithm can be extended to arbitrary signal models by using the optimal denoising algorithm for signals in additive Gaussian noise [Donoho, Johnstone, and Montanari 2013]
     • Examples: hard thresholding for sparse signals; block thresholding for block-sparse signals; total variation denoisers for piecewise constant signals; image denoising algorithms [Metzler, Maleki, and Baraniuk 2014] [Tan, Ma, and Baron 2015]
     • "Flexible" AMP requires formulation of a new Onsager correction term specific to the denoiser applied
     • The correction term can also be estimated numerically via Monte Carlo iterations: average over multiple draws of b perturbed by a small amount to obtain a numerical estimate of the expected value [Metzler, Maleki, and Baraniuk 2014]
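A minimal numerical sketch of the Monte Carlo divergence estimate described above, for an arbitrary denoiser D and real-valued signals; the perturbation size eps and number of draws are assumed, illustrative values.

```python
import numpy as np

def mc_divergence(denoiser, b, eps=1e-3, n_draws=2, rng=None):
    """Monte Carlo estimate of div D(b) = sum_n d[D(b)]_n / d b_n,
    averaging n' (D(b + eps*n) - D(b)) / eps over random draws n ~ N(0, I)."""
    rng = np.random.default_rng() if rng is None else rng
    est = 0.0
    for _ in range(n_draws):
        n = rng.standard_normal(b.shape)
        est += n @ (denoiser(b + eps * n) - denoiser(b)) / eps
    return est / n_draws
```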

  18. AMP with Parametric Denoisers
     • Statistical parameter estimation algorithms can be paired with generative signal models to provide "parametric denoisers"
     • Rich literature in statistical parameter estimation for a multitude of problems (including line spectral estimation)
     • Estimate the Onsager correction term numerically: average over multiple realizations of b perturbed by a small amount to obtain a numerical estimate of the expected value
     • In practice, 1-2 iterations often suffice. [Metzler, Maleki, and Baraniuk 2014]
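Putting the pieces together, a sketch of an AMP loop that accepts a generic (e.g., parametric) denoiser and estimates the Onsager term with mc_divergence from the previous sketch; the denoiser interface, scalings, and iteration count are assumptions for illustration, not the exact algorithm from the talk.

```python
import numpy as np

def amp_parametric(y, Phi, denoiser, n_iters=30):
    """AMP with an arbitrary denoiser; Onsager divergence estimated numerically."""
    M, N = Phi.shape
    xhat = np.zeros(N)
    r = y.copy()
    for _ in range(n_iters):
        b = xhat + Phi.T @ r                  # pseudo-data: signal plus (approx.) AWGN
        xhat = denoiser(b)                    # parametric denoising step
        div = mc_divergence(denoiser, b)      # numerical Onsager divergence
        r = y - Phi @ xhat + (div / M) * r    # Onsager-corrected residual
    return xhat
```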

  19. Parametric Denoising via Line Spectral Estimation
     Line Spectral Estimation-Based Denoising Algorithm
     Inputs: noisy observation x, target sparsity K
     Outputs: parameter estimates, denoised signal
     Candidate estimators: MUSIC, Root MUSIC, ESPRIT, PHD, ...
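A MUSIC-flavored sketch of such a denoiser for complex-valued signals made of K sinusoids: estimate K frequencies from the noisy observation, then project onto the span of the corresponding exponentials. The window length, grid density, and peak-picking rule are illustrative assumptions, not the estimator used in the talk.

```python
import numpy as np

def lse_denoise(x, K, L=None, grid=4096):
    """Line spectral estimation denoiser (MUSIC-style sketch)."""
    N = len(x)
    L = N // 2 if L is None else L                 # window length (must exceed K)
    # Single-snapshot covariance estimate from length-L sliding windows
    W = np.array([x[i:i + L] for i in range(N - L + 1)])
    R = W.conj().T @ W / W.shape[0]
    # Noise subspace: eigenvectors of the L-K smallest eigenvalues
    _, V = np.linalg.eigh(R)                       # eigenvalues in ascending order
    En = V[:, :L - K]
    # MUSIC pseudospectrum on a dense frequency grid
    omegas = np.arange(grid) / grid
    A = np.exp(2j * np.pi * np.arange(L)[:, None] * omegas[None, :])
    P = 1.0 / np.sum(np.abs(En.conj().T @ A) ** 2, axis=0)
    # Pick K peaks greedily, excluding a small band around each selection
    order, peaks, excl = np.argsort(P)[::-1], [], grid // L
    for idx in order:
        if all(min(abs(idx - j), grid - abs(idx - j)) > excl for j in peaks):
            peaks.append(idx)
        if len(peaks) == K:
            break
    freqs = omegas[peaks]
    # Least-squares amplitudes give the denoised signal
    B = np.exp(2j * np.pi * np.arange(N)[:, None] * freqs[None, :])
    amps, *_ = np.linalg.lstsq(B, x, rcond=None)
    return B @ amps, freqs                          # denoised signal, parameter estimates
```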

  20. Prior Use of Denoisers as Sparse Approximation Algorithms: IHT + Line Spectral Estimation
     Inputs: measurement vector y, measurement matrix, structured sparse approx. algorithm
     Output: PD coefficient estimate
     • Initialize
     • While halting criterion false:
       - (estimate signal)
       - (obtain LSE parametric sparse approx.)
       - (calculate residual)
     • Return estimate
     [Duarte and Baraniuk, 2012]
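A short sketch of this prior approach: the same IHT loop as before, with the line-spectral-estimation denoiser from the previous sketch playing the role of the sparse approximation step. For simplicity the sketch works directly in signal space; function names refer to the earlier examples.

```python
import numpy as np

def iht_lse(y, Phi, K, n_iters=50):
    N = Phi.shape[1]
    xhat = np.zeros(N, dtype=complex)
    r = y.astype(complex)
    for _ in range(n_iters):
        b = xhat + Phi.conj().T @ r            # estimate signal
        xhat, freqs = lse_denoise(b, K)        # LSE parametric sparse approximation
        r = y - Phi @ xhat                     # calculate residual
    return xhat, freqs
```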
