Compressive Sensing for High-Dimensional Data - Richard Baraniuk - PowerPoint PPT Presentation



SLIDE 1

Compressive Sensing for High-Dimensional Data

Richard Baraniuk

Rice University, dsp.rice.edu/cs

DIMACS Workshop on Recent Advances in Mathematics and Information Sciences for Analysis and Understanding of Massive and Diverse Sources of Data

SLIDE 2

Pressure is on DSP

  • Increasing pressure on signal/image processing hardware and algorithms to support:
    – higher resolution / denser sampling
      » ADCs, cameras, imaging systems, …
    – large numbers of sensors
      » multi-view target databases, camera arrays and networks, pattern recognition systems, …
    – increasing numbers of modalities
      » acoustic, seismic, RF, visual, IR, SAR, …
  • = deluge of data
    » how to acquire, store, fuse, and process it efficiently?

SLIDE 3

Data Acquisition

  • Time: A/D converters, receivers, …
  • Space: cameras, imaging systems, …
  • Foundation: Shannon sampling theorem
    – Nyquist rate: must sample at 2x the highest frequency in the signal
    – yields N periodic samples
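The Nyquist criterion can be checked numerically: sampling a tone below twice its frequency produces samples indistinguishable from those of a lower-frequency alias. A minimal sketch, where the 5 Hz tone and 6 Hz sampling rate are illustrative choices, not values from the slides:

```python
import numpy as np

f = 5.0        # tone frequency (Hz); the Nyquist rate is 2f = 10 Hz
fs = 6.0       # sub-Nyquist sampling rate
n = np.arange(24)

x_true = np.sin(2 * np.pi * f * n / fs)          # samples of the 5 Hz tone
x_alias = np.sin(2 * np.pi * (f - fs) * n / fs)  # samples of its -1 Hz alias

# Below the Nyquist rate, the two tones cannot be told apart from samples alone
assert np.allclose(x_true, x_alias)
```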

SLIDE 4

Sensing by Sampling

  • Long-established paradigm for digital data acquisition:
    – sample data (A-to-D converter, digital camera, …)
    – compress data (signal-dependent, nonlinear; e.g., a sparse wavelet transform)
    – transmit/store, receive, decompress

SLIDE 5

Sparsity / Compressibility

figure: pixels have few large wavelet coefficients; wideband signal samples have few large Gabor coefficients

  • Number of samples N is often too large, so compress
    – transform coding: exploit data sparsity/compressibility in some representation (ex: an orthonormal basis)
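Transform coding can be sketched in a few lines: a signal that is sparse in some orthonormal basis is represented almost exactly by its few largest coefficients. Here the Fourier basis and three tones stand in for the wavelet/Gabor examples; the signal and the choice of 6 coefficients are illustrative:

```python
import numpy as np

N = 256
t = np.arange(N)
# Three tones at integer frequencies: exactly 6 nonzero Fourier coefficients
x = (np.sin(2*np.pi*10*t/N) + 0.5*np.sin(2*np.pi*37*t/N)
     + 0.2*np.sin(2*np.pi*80*t/N))

X = np.fft.fft(x)
keep = np.argsort(np.abs(X))[-6:]        # indices of the 6 largest coefficients
X_k = np.zeros_like(X)
X_k[keep] = X[keep]
x_k = np.real(np.fft.ifft(X_k))          # rebuild from 6 of 256 coefficients

rel_err = np.linalg.norm(x - x_k) / np.linalg.norm(x)
assert rel_err < 1e-10                   # near-perfect reconstruction
```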

SLIDE 6

Compressive Data Acquisition

  • When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss through dimensionality reduction

figure: measurements of a sparse signal (sparse in some basis)

SLIDE 7

Compressive Data Acquisition

  • When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss
  • Random projection will work

figure: measurements of a sparse signal (sparse in some basis)
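A minimal numerical sketch of this acquisition step: take M random Gaussian measurements of a K-sparse signal and recover it. Orthogonal Matching Pursuit is used here purely to make the sketch self-contained (the deck does not prescribe a particular reconstruction algorithm), and all dimensions are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 256, 128, 5                    # ambient dim, measurements, sparsity

x = np.zeros(N)                          # K-sparse signal
support = rng.choice(N, K, replace=False)
x[support] = rng.standard_normal(K)

Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random measurement matrix
y = Phi @ x                              # M << N compressive measurements

def omp(Phi, y, K):
    """Orthogonal Matching Pursuit: greedily build the support of x."""
    r, S = y.copy(), []
    for _ in range(K):
        S.append(int(np.argmax(np.abs(Phi.T @ r))))      # most correlated column
        coef, *_ = np.linalg.lstsq(Phi[:, S], y, rcond=None)
        r = y - Phi[:, S] @ coef                         # update the residual
    x_hat = np.zeros(Phi.shape[1])
    x_hat[S] = coef
    return x_hat

x_hat = omp(Phi, y, K)
assert np.linalg.norm(x - x_hat) / np.linalg.norm(x) < 1e-8
```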

SLIDE 8

Compressive Data Acquisition

  • When data is sparse/compressible, can directly acquire a condensed representation with no/little information loss
  • Random projection preserves information
    – Johnson-Lindenstrauss Lemma (point clouds, 1984)
    – Compressive Sensing (CS) (sparse and compressible signals; Candès-Romberg-Tao, Donoho, 2004)

figure: project, then reconstruct

SLIDE 9

Why Does It Work? (1)

  • Random projection is not full rank, but stably embeds
    – sparse/compressible signal models (CS)
    – point clouds (JL)
    into a lower-dimensional space with high probability
  • Stable embedding: preserves structure
    – distances between points, angles between vectors, …
    provided M is large enough

figure: Compressive Sensing, K-sparse model (K-dimensional planes)

SLIDE 10

Why Does It Work? (2)

  • Random projection is not full rank, but stably embeds
    – sparse/compressible signal models (CS)
    – point clouds (JL)
    into a lower-dimensional space with high probability
  • Stable embedding: preserves structure
    – distances between points, angles between vectors, …
    provided M is large enough

figure: Johnson-Lindenstrauss, cloud of Q points
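JL-style distance preservation is easy to observe empirically: project a cloud of Q points through a scaled random Gaussian matrix and compare pairwise distances before and after. The dimensions below are illustrative:

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
N, M, Q = 1000, 200, 20                  # ambient dim, projected dim, points

X = rng.standard_normal((Q, N))          # point cloud in R^N
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # so E||Phi v||^2 = ||v||^2
Y = X @ Phi.T                            # projected cloud in R^M

ratios = np.array([np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])
                   for i, j in combinations(range(Q), 2)])

# Every pairwise distance survives the projection to within modest distortion
assert 0.7 < ratios.min() <= ratios.max() < 1.3
```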

SLIDE 11

CS Hallmarks

  • CS changes the rules of the data acquisition game
    – exploits a priori signal sparsity information
  • Universal
    – the same random projections / hardware can be used for any compressible signal class (generic)
  • Democratic
    – each measurement carries the same amount of information
    – simple encoding
    – robust to measurement loss and quantization
  • Asymmetrical (most processing at the decoder)
  • Random projections are weakly encrypted
SLIDE 12

Example: “Single-Pixel” CS Camera

figure: random pattern on DMD array, single photon detector, image reconstruction/processing

SLIDE 13

Example Image Acquisition

figure: 4096-pixel image acquired from 500 random measurements

SLIDE 14

Analog-to-Information Conversion

  • For real-time, streaming use, can have banded structure
  • Can implement in analog hardware (pseudo-random code)

SLIDE 15

Analog-to-Information Conversion

figure: radar chirps with narrowband interference; the signal after AIC (pseudo-random code)

  • For real-time, streaming use, can have banded structure
  • Can implement in analog hardware
SLIDE 16

Information Scalability

  • If we can reconstruct a signal from compressive measurements, then we should be able to perform other kinds of statistical signal processing:
    – detection
    – classification
    – estimation
    – …

SLIDE 17

Multiclass Likelihood Ratio Test

  • Observe one of P known signals in noise
  • Classify according to the maximum-likelihood rule: choose the class p maximizing f(x | class p)
  • AWGN: nearest-neighbor classification
SLIDE 18

Compressive LRT

  • Compressive observations: y = Φx
  • By the JL Lemma, these distances are preserved (*)

[Waagen et al 05; RGB, Davenport et al 06; Haupt et al 06]
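A sketch of the compressive LRT in the AWGN case: nearest-neighbor classification applied directly to the projected observation and the projected candidate signals. The dimensions, noise level, and Gaussian test signals are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
N, M, P = 512, 64, 4                     # ambient dim, measurements, classes

S = rng.standard_normal((P, N))          # P known signals
true = 2
x = S[true] + 0.1 * rng.standard_normal(N)   # observation: signal + AWGN

Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x                              # compressive observation

# Nearest-neighbor test in the compressive domain: the relevant distances are
# approximately preserved (JL), so the decision matches the full-data test
# with high probability
dists = np.linalg.norm(y - S @ Phi.T, axis=1)
assert int(np.argmin(dists)) == true
```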

SLIDE 19

Matched Filter

  • In many applications, signals are transformed with an unknown parameter; ex: translation
  • Elegant solution: matched filter
    – compute the test statistic for all values of the parameter
  • Challenge: extend the compressive LRT to accommodate parameterized signal transformations
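The matched filter for an unknown translation can be sketched as an FFT-based correlation against every circular shift of the template; the template, shift, and noise level below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 256
template = rng.standard_normal(N)                 # known signal
shift = 37                                        # unknown translation parameter
x = np.roll(template, shift) + 0.2 * rng.standard_normal(N)

# Matched filter: correlate the observation against all circular shifts at once
corr = np.real(np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(template))))
assert int(np.argmax(corr)) == shift              # peak at the true translation
```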

SLIDE 20

Generalized Likelihood Ratio Test

  • The matched filter is a special case of the GLRT
  • The GLRT approach extends to any case where each class can be parameterized with K parameters
  • If the mapping from parameters to signal is well-behaved, then each class forms a manifold in R^N

SLIDE 21

What is a Manifold?

“Manifolds are a bit like pornography: hard to define, but you know one when you see one.” – S. Weinberger [Lee]

  • Locally Euclidean topological space
  • Roughly speaking:
    – a collection of mappings of open sets of R^K glued together (“coordinate charts”)
    – can be an abstract space, not a subset of Euclidean space; e.g., SO(3), Grassmannian
  • Typically for signal processing:
    – a nonlinear K-dimensional “surface” in signal space R^N

SLIDE 22

Object Rotation Manifold

figure: K = 1

SLIDE 23

Up/Down Left/Right Manifold

[Tenenbaum, de Silva, Langford]

figure: K = 2

SLIDE 24

Manifold Classification

  • Now suppose data is drawn from one of P possible manifolds
  • AWGN: nearest-manifold classification

figure: M1, M2, M3

SLIDE 25

Compressive Manifold Classification

  • Compressive observations: y = Φx
  • Does nearest-manifold classification still work?

SLIDE 26

Compressive Manifold Classification

  • Compressive observations: y = Φx
  • Good news: the structure of smooth manifolds is preserved by random projection, provided M is large enough
    – distances, geodesic distances, angles, …

[RGB and Wakin, 06]

SLIDE 27

Stable Manifold Embedding

Theorem: Let F ⊂ R^N be a compact K-dimensional manifold with
  – condition number 1/τ (curvature, self-avoidance)
  – volume V
Let Φ be a random MxN orthoprojector with
  M = O( K log(N V τ⁻¹ ε⁻¹) log(1/ρ) / ε² ).
Then with probability at least 1-ρ, the following statement holds: for every pair x, y ∈ F,
  (1-ε) √(M/N) ≤ ‖Φx − Φy‖₂ / ‖x − y‖₂ ≤ (1+ε) √(M/N).
[Wakin et al 06]

SLIDE 28

Manifold Learning from Compressive Measurements

ISOMAP, HLLE, Laplacian Eigenmaps

figure: embeddings learned from data in R^4096 vs. from random projections in R^M, with M = 15, 15, and 20

SLIDE 29

The Smashed Filter

  • Compressive manifold classification with the GLRT
    – nearest-manifold classifier based on the projected manifolds ΦM1, ΦM2, ΦM3

figure: manifolds M1, M2, M3 and their projections
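A toy version of the smashed filter: each class manifold is the set of circular shifts of a template, and classification searches the projected manifolds ΦMp for the point nearest the compressive observation. A noiseless observation and small illustrative dimensions keep the sketch exact:

```python
import numpy as np

rng = np.random.default_rng(4)
N, M, P = 128, 20, 3                     # pixels, measurements, classes
templates = rng.standard_normal((P, N))  # one template per class
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

true_class, true_shift = 1, 50
y = Phi @ np.roll(templates[true_class], true_shift)   # compressive observation

# Smashed filter: nearest point over all projected manifolds Phi*Mp,
# sweeping both the class label p and the shift parameter s
best = min((np.linalg.norm(y - Phi @ np.roll(templates[p], s)), p, s)
           for p in range(P) for s in range(N))
assert (best[1], best[2]) == (true_class, true_shift)
```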

SLIDE 30

Multiple Manifold Embedding

Corollary: Let M1, …, MP ⊂ R^N be compact K-dimensional manifolds with
  – condition number 1/τ (curvature, self-avoidance)
  – volume V
  – min dist(Mj, Mk) > τ (can be relaxed)
Let Φ be a random MxN orthoprojector with M as in the theorem above. Then with probability at least 1-ρ, the following statement holds: for every pair x, y ∈ ∪j Mj,
  (1-ε) √(M/N) ≤ ‖Φx − Φy‖₂ / ‖x − y‖₂ ≤ (1+ε) √(M/N).

SLIDE 31

Smashed Filter - Experiments

  • 3 image classes: tank, school bus, SUV
  • N = 64K pixels
  • Imaged using the single-pixel CS camera with
    – unknown shift
    – unknown rotation

SLIDE 32

Smashed Filter – Unknown Position

  • Image shifted at random (K = 2 manifold)
  • Noise added to measurements
    – identify the most likely position for each image class
    – identify the most likely class using a nearest-neighbor test

figure: avg. shift estimate error and classification rate (%) vs. number of measurements M, for increasing noise levels

SLIDE 33

Smashed Filter – Unknown Rotation

  • Training set constructed for each class with compressive measurements
    – rotations at 10°, 20°, …, 360° (K = 1 manifold)
    – identify the most likely rotation for each image class
    – identify the most likely class using a nearest-neighbor test
  • Perfect classification with as few as 6 measurements
  • Good estimates of the viewing angle with under 10 measurements

figure: avg. rotation estimate error vs. number of measurements M
SLIDE 34

Conclusions

  • Compressive measurements are information scalable:
    reconstruction > estimation > classification > detection
  • Smashed filter: dimension-reduced GLRT for parametrically transformed signals
    – exploits compressive measurements and manifold structure
    – broadly applicable: targets do not have to have a sparse representation in any basis
    – effective for image classification when combined with the single-pixel camera
  • Current work
    – efficient parameter estimation using a multiscale Newton's method [Wakin, Donoho, Choi, RGB, 05]
    – linking continuous manifold models to discrete point cloud models [Wakin, DeVore, Davenport, RGB, 05]
    – noise analysis and tradeoffs (M/N SNR penalty)
    – compressive k-NN, SVMs, …

dsp.rice.edu/cs