Fast Compressive Sampling Using Structurally Random Matrices

Fast Compressive Sampling Using Structurally Random Matrices - PowerPoint PPT Presentation



  1. Fast Compressive Sampling Using Structurally Random Matrices. Presented by: Thong Do (thongdo@jhu.edu), The Johns Hopkins University. A joint work with Prof. Trac Tran, The Johns Hopkins University, and Dr. Lu Gan, Brunel University, UK.

  2. Compressive Sampling Framework. Main assumption: the input signal has a K-sparse representation x = Ψα, where Ψ (N×N) is the sparsifying transform and α (N×1) is the vector of transform coefficients with only K nonzero entries. Compressive sampling: y = Φx, where Φ (M×N) is the sensing matrix (a random matrix, a random row subset of an orthogonal matrix such as partial Fourier, etc.) and y (M×1) holds the compressed measurements, so y = ΦΨα with ΦΨ acting as the sensing matrix on α. Reconstruction by L1-minimization (Basis Pursuit): α̂ = arg min ||α||_1 subject to y = ΦΨα, then x̂ = Ψα̂.
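A minimal numerical sketch of this framework, assuming a DCT sparsifying basis and a dense i.i.d. Gaussian sensing matrix (choices made for illustration only, not taken from the slides); Basis Pursuit is posed as a small linear program.

```python
import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

rng = np.random.default_rng(0)
N, M, K = 128, 64, 10

# K-sparse representation: x = Psi * alpha with only K nonzero coefficients.
Psi = idct(np.eye(N), axis=0, norm="ortho")      # columns = inverse-DCT basis vectors
alpha = np.zeros(N)
alpha[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
x = Psi @ alpha

# Compressive sampling: y = Phi * x with a dense random sensing matrix Phi.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x

# Basis Pursuit: min ||alpha||_1  s.t.  Phi * Psi * alpha = y,
# written as a linear program with alpha = u - v and u, v >= 0.
A = Phi @ Psi
res = linprog(c=np.ones(2 * N), A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
alpha_hat = res.x[:N] - res.x[N:]
x_hat = Psi @ alpha_hat
print("reconstruction error:", np.linalg.norm(x - x_hat))
```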

  3. A Wish-list of the Sampling Operator. Optimal performance: require the minimal number of compressed measurements. Universality: incoherent with various families of signals. Practicality: fast computation, memory efficiency, hardware friendliness, streaming capability.

  4. Current Sensing Matrices. Random matrices [Candès, Tao, Donoho]: optimal performance, but huge memory and computational complexity; not appropriate in large-scale applications. Partial Fourier [Candès et al.]: fast computation, but not universal; only incoherent with signals sparse in time, and not incoherent with smooth signals such as natural images. Other methods (Scrambled FFT, Random Filters, ...): either lack universality or have no theoretical guarantee.

  5. Motivation. Significant performance improvement of the scrambled FFT over partial Fourier: a well-known fact [Baraniuk, Candès], but with no theoretical justification. Example: the original 512x512 Lena image is sampled by an FFT followed by a random downsampler (partial Fourier) and reconstructed by Basis Pursuit from 25% of measurements: 16.5 dB.

  6. Motivation (continued). With the same setup but a scrambled FFT (scrambling the samples before the FFT and the random downsampler), Basis Pursuit reconstruction of the 512x512 Lena image from 25% of measurements reaches 29.4 dB.

  7. Our Contributions. Propose the concept of structurally random ensembles, an extension of the Scrambled Fourier Ensemble. Provide a theoretical guarantee for this novel sensing framework. Design sensing ensembles with practical features: fast computation, memory efficiency, hardware friendliness, streaming capability, etc.

  8. Proposed CS System. Pre-randomizer, in one of two forms. Global randomizer: random permutation of sample indices. Local randomizer: random sign reversal of sample values. Pipeline: input signal, pre-randomizer, fast transform (FFT, WHT, DCT), random downsampler, compressed measurements; signal recovery by Basis Pursuit reconstruction.

  9. Proposed CS System (continued). Compressive sampling: y = D(T(P(x))) = A(x), i.e., pre-randomize the input signal (P), apply a fast transform (T), and pick up a random subset of the transform coefficients (D). Reconstruction: Basis Pursuit with the sensing operator and its adjoint, A(·) = D(T(P(Ψ(·)))) and A*(·) = Ψ*(P*(T*(D*(·)))), where x = Ψα. (A code sketch of this operator follows slide 10.)

  10. Proposed CS System (continued). Same sampling and reconstruction as on the previous slide; the cascade of pre-randomizer P, fast transform T (FFT, WHT, DCT, ...), and random downsampler D, i.e., the product D·T·P, is the structurally random matrix.
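A minimal sketch of the sampling operator y = D(T(P(x))) and its adjoint, as one might implement them for a matrix-free solver. The DCT as the fast transform T and a random sign reversal as the pre-randomizer P are illustrative choices, and the variable names are not from the slides; for reconstruction, the sparsifying transform Ψ would be composed in as on slide 9.

```python
import numpy as np
from scipy.fft import dct, idct

rng = np.random.default_rng(1)
N, M = 1024, 256

signs = rng.choice([-1.0, 1.0], size=N)        # pre-randomizer P: random sign reversal
keep = rng.choice(N, size=M, replace=False)    # random downsampler D: indices to keep

def A(x):
    """Forward operator y = D(T(P(x))): randomize, transform, keep M coefficients."""
    return dct(signs * x, norm="ortho")[keep]

def A_adj(y):
    """Adjoint operator P*(T*(D*(y))): zero-pad, inverse transform, undo randomization."""
    full = np.zeros(N)
    full[keep] = y
    return signs * idct(full, norm="ortho")

# The M x N matrix is never formed explicitly; a solver only needs A and A_adj.
x = rng.standard_normal(N)
y = A(x)
print(np.dot(y, A(x)), np.dot(x, A_adj(y)))    # adjoint check: the two values agree
```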

  11. Structurally Random Matrices. A structurally random matrix with the local randomizer is a product of three matrices: a random downsampler D with Pr(d_ii = 0) = 1 - M/N and Pr(d_ii = 1) = M/N; a fast transform (FFT, WHT, DCT, ...); and a local randomizer D' with Pr(d'_ii = ±1) = 1/2.

  12. Structurally Random Matrices (continued). A structurally random matrix with the global randomizer is a product of three matrices: a random downsampler D with Pr(d_ii = 0) = 1 - M/N and Pr(d_ii = 1) = M/N; a fast transform (FFT, WHT, DCT, ...); and a global randomizer, a uniformly random permutation matrix. (With the FFT as the fast transform, the downsampler and transform alone form the partial Fourier ensemble; an explicit small-scale construction of both variants is sketched below.)
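For small N, both variants can be written out as explicit matrices to make the structure visible. This is a sketch under assumed conventions (a fixed M-row subset in place of the diagonal Bernoulli selector, and an orthonormal WHT scaling), not the authors' code.

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(2)
N, M = 16, 8

F = hadamard(N) / np.sqrt(N)                    # fast transform: orthonormal WHT
D = np.eye(N)[rng.choice(N, M, replace=False)]  # random downsampler (fixed M-row subset)

R = np.diag(rng.choice([-1.0, 1.0], size=N))    # local randomizer: random sign reversal
P = np.eye(N)[rng.permutation(N)]               # global randomizer: random permutation

Phi_local = D @ F @ R                           # SRM with local randomizer
Phi_global = D @ F @ P                          # SRM with global randomizer
# D @ F alone (no pre-randomizer) is the partial-transform ensemble,
# i.e. partial Fourier when F is the FFT.
```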

  13. Sparse Structurally Random Matrices. With the local randomizer: fast computation, memory efficiency, hardware friendliness, streaming capability. Structure: local randomizer D' with Pr(d'_ii = ±1) = 1/2; block-diagonal WHT, DCT, FFT, etc.; random downsampler D with Pr(d_ii = 0) = 1 - M/N and Pr(d_ii = 1) = M/N.

  14. Sparse Structurally Random Matrices (continued). With the global randomizer: fast computation, memory efficiency, hardware friendliness, nearly streaming capability. Structure: global randomizer (uniformly random permutation matrix); block-diagonal WHT, DCT, FFT, etc.; random downsampler D with Pr(d_ii = 0) = 1 - M/N and Pr(d_ii = 1) = M/N. (A block-diagonal construction is sketched below.)
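A sketch of how a sparse structurally random matrix might be assembled from small Walsh-Hadamard blocks; the block size, scaling, and fixed M-row downsampler are illustrative assumptions rather than the authors' exact construction.

```python
import numpy as np
from scipy.linalg import hadamard, block_diag

rng = np.random.default_rng(3)
N, M, B = 256, 64, 8                            # B = block size (e.g. WHT8)

blocks = [hadamard(B) / np.sqrt(B)] * (N // B)
F_block = block_diag(*blocks)                   # block-diagonal fast transform

P = np.eye(N)[rng.permutation(N)]               # global randomizer (permutation matrix)
D = np.eye(N)[rng.choice(N, M, replace=False)]  # random downsampler (fixed M-row subset)

Phi_sparse = D @ F_block @ P
# Every row of D @ F_block has only B = 8 nonzeros, and the permutation P only
# reorders columns, so Phi_sparse is N/B = 32 times sparser than a dense SRM.
print(np.count_nonzero(Phi_sparse, axis=1))     # each row: 8 nonzero entries
```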

  15. Theoretical Analysis. Theorem 1: Assume that the maximum absolute entries of an M×N structurally random matrix Φ and an N×N orthonormal matrix Ψ are not larger than 1/√(log N). Then, with high probability, the coherence of Φ and Ψ is not larger than O(√(log N / s)), where s is the average number of nonzero entries per row of the M×N sampling matrix. Proof: Bernstein's concentration inequality for sums of independent random variables. For comparison, the optimal coherence (Gaussian/Bernoulli random matrices) is O(√(log N / N)).
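The coherence in Theorem 1 is the largest inner product between rows of Φ and columns of Ψ. Below is a small numerical illustration of that quantity, comparing a dense WHT-based structurally random matrix against an i.i.d. Gaussian matrix; the matrix choices and scalings are assumptions made for illustration only.

```python
import numpy as np
from scipy.fft import idct
from scipy.linalg import hadamard

rng = np.random.default_rng(4)
N, M = 256, 64

Psi = idct(np.eye(N), axis=0, norm="ortho")     # sparsifying basis: DCT

# Dense SRM: random sign reversal, full WHT, then M random rows.
F = hadamard(N) / np.sqrt(N)
R = np.diag(rng.choice([-1.0, 1.0], size=N))
Phi_srm = (F @ R)[rng.choice(N, M, replace=False)]

# i.i.d. Gaussian matrix with unit-norm rows, for comparison.
Phi_gauss = rng.standard_normal((M, N))
Phi_gauss /= np.linalg.norm(Phi_gauss, axis=1, keepdims=True)

for name, Phi in [("SRM", Phi_srm), ("Gaussian", Phi_gauss)]:
    mu = np.max(np.abs(Phi @ Psi))              # max |<phi_i, psi_j>|
    print(f"{name:9s} coherence = {mu:.3f}   (sqrt(log N / N) = {np.sqrt(np.log(N) / N):.3f})")
```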

  16. Theoretical Analysis (continued). Theorem 2: Under the previous assumption, sampling a signal with a structurally random matrix guarantees exact reconstruction (by Basis Pursuit) with high probability, provided that M ~ (N/s) K log² N, where s is the average number of nonzero entries per row of the sampling matrix, N is the length of the signal, and K is the sparsity of the signal. Proof: follow the proof framework of [Candès2007] together with the previous theorem on the coherence of structurally random matrices. For comparison, the optimal number of measurements required by dense Gaussian/Bernoulli random matrices is K log N. [Candès2007] E. Candès and J. Romberg, "Sparsity and incoherence in compressive sampling," Inverse Problems, 23(3), pp. 969-985, 2007.
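For orientation, reading the bound in the dense case s = N (this is only a specialization of the slide's formula, not an additional result from the talk):

```latex
M \;\sim\; \frac{N}{s}\,K\log^2 N
\quad\xrightarrow{\;s \,=\, N\;}\quad
M \;\sim\; K\log^2 N .
```

That is, in this bound a dense structurally random matrix is at most a single log N factor away from the K log N measurements required by dense Gaussian/Bernoulli matrices, while a sparse one (s much smaller than N) pays an additional N/s factor in measurements in exchange for the speed and memory advantages listed earlier.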

  17. Simulation Results: Sparse 1D Signals. Input signal sparse in the DCT domain, N = 256, K = 30. Reconstruction: Orthogonal Matching Pursuit (OMP). Sampling ensembles: WHT256 + global randomizer (random permutation of sample indices + Walsh-Hadamard); WHT256 + local randomizer (random sign reversal of sample values + Walsh-Hadamard); WHT8 + global randomizer (random permutation of sample indices + 8×8 block-diagonal Walsh-Hadamard). The fraction of nonzero entries in the last case is 1/32, i.e., 32 times sparser than the Scrambled FFT, the i.i.d. Gaussian ensemble, etc. (An OMP experiment sketch follows.)
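A sketch of how the "WHT256 + global randomizer" configuration on this slide might be reproduced. The OMP routine, the measurement count M = 128, and the scalings are minimal choices made for illustration and are not the authors' code.

```python
import numpy as np
from scipy.fft import idct
from scipy.linalg import hadamard

rng = np.random.default_rng(5)
N, M, K = 256, 128, 30                           # M = 128 is an assumed measurement count

# DCT-sparse test signal, as on the slide (N = 256, K = 30).
Psi = idct(np.eye(N), axis=0, norm="ortho")
alpha = np.zeros(N)
alpha[rng.choice(N, K, replace=False)] = rng.standard_normal(K)
x = Psi @ alpha

# WHT256 + global randomizer: permute sample indices, Walsh-Hadamard transform,
# keep M random coefficients (written here as an explicit D * F * P matrix).
perm = rng.permutation(N)
rows = rng.choice(N, M, replace=False)
Phi = (hadamard(N) / np.sqrt(N))[rows][:, perm]
y = Phi @ x
A = Phi @ Psi                                    # effective sensing matrix acting on alpha

def omp(A, y, K):
    """Plain Orthogonal Matching Pursuit: greedily pick K atoms, least-squares refit."""
    residual, support = y.copy(), []
    for _ in range(K):
        support.append(int(np.argmax(np.abs(A.T @ residual))))
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    alpha_hat = np.zeros(A.shape[1])
    alpha_hat[support] = coef
    return alpha_hat

x_hat = Psi @ omp(A, y, K)
print("relative error:", np.linalg.norm(x - x_hat) / np.linalg.norm(x))
```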
