Fast Compressive Sampling Using Structurally Random Matrices - PowerPoint PPT Presentation



SLIDE 1

Fast Compressive Sampling Using Structurally Random Matrices

Presented by: Thong Do (thongdo@jhu.edu), The Johns Hopkins University

A joint work with:

  • Prof. Trac Tran, The Johns Hopkins University
  • Dr. Lu Gan, Brunel University, UK
SLIDE 2

Compressive Sampling Framework

  • Main assumption: the input signal has a K-sparse representation

    x = Ψα

    – x: N×1 input signal
    – Ψ: N×N sparsifying transform
    – α: N×1 vector of transform coefficients, with only K nonzero entries

  • Compressive sampling:

    y = Φx

    – Φ: M×N sensing matrix (M << N); e.g., a random matrix, a random row subset of an orthogonal matrix (partial Fourier), etc.
    – y: M×1 vector of compressed measurements

  • Reconstruction: L1-minimization (Basis Pursuit)

    α̂ = arg min ||α||₁  s.t.  y = ΦΨα,  then  x̂ = Ψα̂
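The sampling model above can be sketched in a few lines of NumPy. The dense Gaussian Φ and the identity Ψ used here are illustrative stand-ins (not the structured matrices this talk proposes), chosen only to show the shapes involved:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 512, 128, 16  # signal length, measurement count, sparsity

# K-sparse coefficient vector alpha: only K of its N entries are nonzero
alpha = np.zeros(N)
alpha[rng.choice(N, size=K, replace=False)] = rng.standard_normal(K)

Psi = np.eye(N)          # stand-in N x N sparsifying transform
x = Psi @ alpha          # N x 1 input signal

# Dense M x N Gaussian sensing matrix (columns scaled to roughly unit norm)
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x              # M x 1 compressed measurements, M << N
```

Reconstruction would then solve the L1 problem over α given y and ΦΨ, as in the Basis Pursuit step above.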

SLIDE 3

A Wish-list of the Sampling Operator

  • Optimal performance:
    – requires the minimal number of compressed measurements
  • Universality:
    – incoherent with various families of signals
  • Practicality:
    – fast computation
    – memory efficiency
    – hardware friendly
    – streaming capability

SLIDE 4

Current Sensing Matrices

  • Random matrices [Candes, Tao, Donoho]
    – Optimal performance
    – Huge memory and computational complexity
    – Not appropriate for large-scale applications
  • Partial Fourier [Candes et al.]
    – Fast computation
    – Non-universality: only incoherent with signals sparse in time; not incoherent with smooth signals such as natural images
  • Other methods (Scrambled FFT, Random Filters, …)
    – Either lack universality or have no theoretical guarantee

SLIDE 5

Motivation

  • Significant performance improvement of scrambled FFT over partial Fourier
    – A well-known fact [Baraniuk, Candès], but with no theoretical justification

[Figure: Original 512x512 Lena image → FFT → random downsampler → compressed measurements → reconstruction by Basis Pursuit. Reconstruction from 25% of measurements: 16.5 dB]

SLIDE 6

Motivation

  • Significant performance improvement of scrambled FFT over partial Fourier
    – A well-known fact [Baraniuk, Candès], but with no theoretical justification

[Figure: Original 512x512 Lena image → scrambled FFT → random downsampler → compressed measurements → reconstruction by Basis Pursuit. Reconstruction from 25% of measurements: 29.4 dB]

SLIDE 7

Our Contributions

  • Propose the concept of structurally random ensembles
    – Extension of the Scrambled Fourier Ensemble
  • Provide theoretical guarantees for this novel sensing framework
  • Design sensing ensembles with practical features
    – Fast computable, memory efficient, hardware friendly, streaming capability, etc.

SLIDE 8

Proposed CS System

  • Pre-randomizer:
    – Global randomizer: random permutation of sample indices
    – Local randomizer: random sign reversal of sample values

[Diagram: Input signal → Pre-randomizer → FFT/WHT/DCT → Random downsampler → Compressed measurements → Reconstruction (signal recovery by Basis Pursuit)]

SLIDE 9

Proposed CS System

  • Compressive sampling: y = D(T(P(x))) = A(x)
    – Pre-randomize the input signal (P)
    – Apply a fast transform (T)
    – Pick a random subset of the transform coefficients (D)
  • Reconstruction: Basis Pursuit with the sensing operator and its adjoint:

    A(•) = D(T(P(Ψ(•))))
    A*(•) = Ψ*(P*(T*(D*(•))))

    where x = Ψα

[Diagram: Input signal → Pre-randomizer P → FFT/WHT/DCT T → Random downsampler D → Compressed measurements → Reconstruction (Basis Pursuit)]
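The operator A and its adjoint never need an explicit matrix; each factor is applied as a function. Below is a minimal sketch assuming a unitary FFT as the fast transform T and a sign-flip local randomizer as P (the adjoint of the downsampler D is zero-padding); the adjoint identity ⟨A x, u⟩ = ⟨x, A* u⟩ is checked numerically:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M = 256, 64

# P: local randomizer (random sign flips); a real diagonal, so P* = P
signs = rng.choice([-1.0, 1.0], size=N)
# D: random downsampler keeping M of N coefficients; D* is zero-padding
keep = rng.choice(N, size=M, replace=False)

def A(x):
    # y = D(T(P(x))) with T the unitary FFT
    return np.fft.fft(signs * x, norm="ortho")[keep]

def A_adj(y):
    # A* = P*(T*(D*(y))): zero-pad, inverse unitary FFT, sign flips
    z = np.zeros(N, dtype=complex)
    z[keep] = y
    return signs * np.fft.ifft(z, norm="ortho")

# Adjoint check: <A x, u> == <x, A* u>  (np.vdot conjugates its first argument)
x = rng.standard_normal(N)
u = rng.standard_normal(M) + 1j * rng.standard_normal(M)
lhs = np.vdot(A(x), u)
rhs = np.vdot(x.astype(complex), A_adj(u))
```

Basis Pursuit solvers only ever call these two routines, which is why the scheme needs no stored sensing matrix.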

SLIDE 10

Proposed CS System

  • Same system as the previous slide: the cascade D(T(P(·))) is what we call a structurally random matrix.

SLIDE 11

Structurally Random Matrices

  • Structurally random matrices with a local randomizer: a product of 3 matrices, Φ = D F R
    – Random downsampler D: diagonal with Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
    – Fast transform F: FFT, WHT, DCT, …
    – Local randomizer R: diagonal with Pr(d′_ii = ±1) = 1/2

SLIDE 12

Structurally Random Matrices

  • Structurally random matrices with a global randomizer: a product of 3 matrices, Φ = D F R
    – Random downsampler D: diagonal with Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
    – Fast transform F: FFT, WHT, DCT, …
    – Global randomizer R: a uniformly random permutation matrix
    – (cf. partial Fourier, which omits the randomizer)
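For small N, both variants of Φ = D·F·R can be formed explicitly to inspect their structure. The sketch below uses an orthonormal DCT-II as F (built densely here only for illustration; in practice F is applied via a fast algorithm) and verifies that with either randomizer the kept rows of Φ remain orthonormal, since sign flips and permutations preserve orthogonality:

```python
import numpy as np

rng = np.random.default_rng(1)
N, M = 64, 16

# F: orthonormal DCT-II matrix
n = np.arange(N)
F = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
F[0, :] /= np.sqrt(2)

# R_local: random sign flips; R_global: uniformly random permutation matrix
R_local = np.diag(rng.choice([-1.0, 1.0], size=N))
R_global = np.eye(N)[rng.permutation(N)]

# D: random downsampler keeping M of the N rows
rows = rng.choice(N, size=M, replace=False)
Phi_local = (F @ R_local)[rows, :]
Phi_global = (F @ R_global)[rows, :]

# Rows of an orthonormal matrix stay orthonormal after sign flips or column
# permutation, so Phi @ Phi.T = I_M for both variants.
ok_local = np.allclose(Phi_local @ Phi_local.T, np.eye(M))
ok_global = np.allclose(Phi_global @ Phi_global.T, np.eye(M))
```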

SLIDE 13

Sparse Structurally Random Matrices

  • With local randomizer: Φ = D F R
    – Random downsampler D: diagonal with Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
    – Fast transform F: block-diagonal WHT, DCT, FFT, etc.
    – Local randomizer R: diagonal with Pr(d_ii = ±1) = 1/2
  • Features: fast computation, memory efficiency, hardware friendly, streaming capability

SLIDE 14

Sparse Structurally Random Matrices

  • With global randomizer: Φ = D F R
    – Random downsampler D: diagonal with Pr(d_ii = 1) = M/N, Pr(d_ii = 0) = 1 − M/N
    – Fast transform F: block-diagonal WHT, DCT, FFT, etc.
    – Global randomizer R: a uniformly random permutation matrix
  • Features: fast computation, memory efficiency, hardware friendly, nearly streaming capability

SLIDE 15

Theoretical Analysis

  • Theorem 1: Assume that the maximum absolute entries of the M×N structurally random matrix Φ and of the N×N orthonormal matrix Ψ are not larger than 1/√(log N). Then, with high probability, the coherence of Φ and Ψ is not larger than O(√(log N / s))
    – s: the average number of nonzero entries per row of Φ
  • Proof: Bernstein's concentration inequality for sums of independent random variables
  • The optimal coherence (Gaussian/Bernoulli random matrices): O(√(log N / N))
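The role of the randomizer in Theorem 1 can be seen numerically. Using the mutual-coherence normalization μ(Φ,Ψ) = √N · max|⟨φ_k, ψ_j⟩| (so μ ranges between 1 and √N), a plain WHT is maximally coherent with the DCT basis, because its constant row coincides with the DC atom, while a sign randomizer drops the coherence to the √(log N) scale. The concrete threshold checked below (μ < 8 for this seed) is an empirical illustration, not part of the theorem:

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(3)
N = 256

W = hadamard(N) / np.sqrt(N)  # orthonormal Walsh-Hadamard rows

# Psi: orthonormal DCT-II atoms as columns
n = np.arange(N)
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0, :] /= np.sqrt(2)
Psi = C.T

signs = rng.choice([-1.0, 1.0], size=N)  # local randomizer

mu_plain = np.sqrt(N) * np.abs(W @ Psi).max()           # no randomizer
mu_rand = np.sqrt(N) * np.abs((W * signs) @ Psi).max()  # with randomizer

# mu_plain hits the worst case sqrt(N) = 16 (constant WHT row vs. DC atom);
# mu_rand lands near the sqrt(log N) scale instead.
```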

SLIDE 16

Theoretical Analysis

  • Theorem 2: With the previous assumption, sampling a signal using a structurally random matrix guarantees exact reconstruction (by Basis Pursuit) with high probability, provided that

    M ~ O((KN/s) log² N)

    – s: the average number of nonzero entries per row of the sampling matrix
    – N: length of the signal
    – K: sparsity of the signal
  • Proof: follows the proof framework of [Candès2007] and the preceding theorem on coherence of structurally random matrices
  • The optimal number of measurements required by Gaussian/Bernoulli dense random matrices: O(K log N)

[Candès2007] E. Candès and J. Romberg, "Sparsity and incoherence in compressive sampling", Inverse Problems, 23(3), pp. 969-985, 2007.

SLIDE 17

Simulation Results: Sparse 1D Signals

  • Input signal sparse in the DCT domain
    – N = 256, K = 30
  • Reconstruction: Orthogonal Matching Pursuit (OMP)
  • Sensing matrices:
    – WHT256 + global randomizer: random permutation of sample indices + Walsh-Hadamard transform
    – WHT256 + local randomizer: random sign reversal of sample values + Walsh-Hadamard transform
    – WHT8 + global randomizer: random permutation of sample indices + 8×8 block-diagonal Walsh-Hadamard transform; the fraction of nonzero entries is 1/32, i.e., 32 times sparser than the Scrambled FFT, i.i.d. Gaussian ensemble, …
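The 1D experiment can be reproduced in miniature. The sketch below senses a DCT-sparse signal with a WHT + local randomizer matrix and recovers it with a plain OMP loop; the parameters (N = 256, K = 8, M = 128) and the coefficient magnitudes are chosen more conservatively than the slide's N = 256, K = 30 so that exact recovery is essentially certain for this seed:

```python
import numpy as np
from scipy.linalg import hadamard

rng = np.random.default_rng(2)
N, M, K = 256, 128, 8

# Sensing: local randomizer (sign flips) + WHT + random downsampler
signs = rng.choice([-1.0, 1.0], size=N)
W = hadamard(N) / np.sqrt(N)
rows = rng.choice(N, size=M, replace=False)
Phi = W[rows, :] * signs[None, :]  # Phi = D W P with P = diag(signs)

# Sparsifying basis: orthonormal DCT-II, atoms as columns (x = Psi @ alpha)
n = np.arange(N)
C = np.sqrt(2.0 / N) * np.cos(np.pi * (2 * n[None, :] + 1) * n[:, None] / (2 * N))
C[0, :] /= np.sqrt(2)
Psi = C.T

# Synthesize a K-sparse coefficient vector with magnitudes in [1, 2)
alpha = np.zeros(N)
supp = rng.choice(N, size=K, replace=False)
alpha[supp] = (1 + rng.random(K)) * rng.choice([-1.0, 1.0], size=K)
y = Phi @ (Psi @ alpha)

# Orthogonal Matching Pursuit over the effective dictionary A = Phi Psi
A = Phi @ Psi
residual, support = y.copy(), []
for _ in range(K):
    support.append(int(np.argmax(np.abs(A.T @ residual))))  # best new atom
    coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
    residual = y - A[:, support] @ coef                     # re-project

alpha_hat = np.zeros(N)
alpha_hat[support] = coef
err = np.linalg.norm(alpha_hat - alpha) / np.linalg.norm(alpha)
```

The same loop works unchanged for the global-randomizer and block-diagonal variants; only the construction of Phi differs.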

SLIDE 18

Simulation Results: Compressible 2D Signals

  • Experiment set-up:
    – Test images: 512×512 Lena and Boat
    – Sparsifying transform Ψ: Daubechies 9/7 wavelet transform
    – L1-minimization solver: GPSR [Figueiredo, Nowak, Wright]
    – Structurally random sensing matrices Φ:
  • WHT512 + local randomizer
    – Random sign reversal of sample values + 512×512 block-diagonal Hadamard transform
    – Full streaming capability
  • WHT32 + global randomizer
    – Random permutation of sample indices + 32×32 block-diagonal Hadamard transform
    – Highly sparse: the fraction of nonzero entries is only 1/2¹³

SLIDE 19

Rate-Distortion Performance: Lena

  • Partial FFT in the wavelet domain:
    – transform the image into wavelet coefficients, then sense these coefficients (rather than the image pixels directly) using partial FFT
    – no universality, more computational complexity; serves as a benchmark
  • WHT512 + local randomizer: full streaming capability
  • WHT32 + global randomizer: 8000 times sparser than the Scrambled FFT

[Figure: R-D performance of 512x512 Lena]

SLIDE 20

Reconstructed Images: Lena

  • Reconstruction from 25% of measurements using GPSR:
    – Original 512x512 Lena image
    – Partial FFT: 16 dB
    – Partial FFT in wavelet domain: 30.1 dB
    – WHT512 + local randomizer: 28.4 dB
    – WHT32 + global randomizer: 29 dB
    – Scrambled FFT: 29.3 dB

SLIDE 21

Rate-Distortion Performance: Boat

[Figure: R-D performance of 512x512 Boat image]

SLIDE 22

Future Research

  • Develop theoretical analysis of structurally random matrices with greedy, iterative reconstruction algorithms such as OMP
  • Application of structurally random matrices to high-dimensionality reduction
    – Fast Johnson-Lindenstrauss transform using structurally random matrices
  • Closer to deterministic compressive sampling
    – Replace the random downsampler by a deterministic downsampler or a deterministic lattice of measurements*
    – Develop theoretical analysis for this nearly deterministic framework

* Basarab Matei and Yves Meyer, "A variant of the compressed sensing of Emmanuel Candès", Preprint, 2008.

SLIDE 23

Conclusions

  • Structurally random matrices are:
    – Fast computable
    – Memory efficient: a highly sparse solution exists (random permutation ⇒ fast block-diagonal transform ⇒ random sampling)
    – Streaming capable: a solution with full streaming capability exists (random sign flipping ⇒ fast block-diagonal transform ⇒ random sampling)
    – Hardware friendly
    – Near-optimal in performance: nearly optimal theoretical bounds, and numerical simulations comparable with completely random matrices

SLIDE 24

References

  • E. Candès and J. Romberg, "Sparsity and incoherence in compressive sampling", Inverse Problems, 23(3), pp. 969-985, 2007.
  • E. Candès, J. Romberg, and T. Tao, "Robust uncertainty principles: Exact signal reconstruction from highly incomplete frequency information", IEEE Trans. on Information Theory, vol. 52, pp. 489-509, Feb. 2006.
  • E. Candès, J. Romberg, and T. Tao, "Stable signal recovery from incomplete and inaccurate measurements", Communications on Pure and Applied Mathematics, vol. 59, pp. 1207-1223, Aug. 2006.
  • M. F. Duarte, M. B. Wakin, and R. G. Baraniuk, "Fast reconstruction of piecewise smooth signals from incoherent projections", SPARS'05, Rennes, France, Nov. 2005.
  • B. Matei and Y. Meyer, "A variant of the compressed sensing of Emmanuel Candès", Preprint, 2008.
  • M. A. T. Figueiredo, R. D. Nowak, and S. J. Wright, "Gradient projection for sparse reconstruction: Application to compressed sensing and other inverse problems", IEEE Journal of Selected Topics in Signal Processing: Special Issue on Convex Optimization Methods for Signal Processing, 1(4), pp. 586-598, 2007.

SLIDE 25

  • thanglong.ece.jhu.edu/CS/fast_cs_SRM.rar