

SLIDE 1

Introduction to Compressed Sensing

Gitta Kutyniok

(Institut für Mathematik, Technische Universität Berlin)

Winter School on “Compressed Sensing”, TU Berlin December 3–5, 2015


SLIDE 2

Outline

1. Modern Data Processing: Data Deluge; Information Content of Data; Why do we need Compressed Sensing?

2. Main Ideas of Compressed Sensing: Sparsity; Measurement Matrices; Recovery Algorithms

3. Applications

4. This Winter School


SLIDE 3

The Age of Data

Problem of the 21st Century: We live in a digitized world. Slogan: "Big Data".
New technologies produce/sense enormous amounts of data.
Problems: storage, transmission, and analysis.
"Big Data Research and Development Initiative", Barack Obama (March 2012).


SLIDE 4

Olympic Games 2012


SLIDE 5

Better, Stronger, Faster!


SLIDE 6

Accelerating Data Deluge

Situation 2010: 1250 billion gigabytes generated in 2010; # digital bits > # stars in the universe; growing by a factor of 10 every 5 years.

[Figure: available transmission bandwidth over time]

Observations:
- Total data generated > total storage.
- Increases in generation rate >> increases in communication rate.


SLIDE 7

What can we do...?


SLIDE 8

Quote by Einstein

“Not everything that can be counted counts, and not everything that counts can be counted.” Albert Einstein


SLIDE 9

An Applied Harmonic Analysis Viewpoint

Exploit a carefully designed representation system (ψλ)λ∈Λ ⊆ H:

H ∋ f → (⟨f, ψλ⟩)λ∈Λ → ∑λ∈Λ ⟨f, ψλ⟩ ψλ = f.

Desiderata:
- Special features encoded in the "large" coefficients |⟨f, ψλ⟩|.
- Efficient representations: f ≈ ∑λ∈ΛN ⟨f, ψλ⟩ ψλ, with #(ΛN) small.

Goals:
- Derive high compression by considering only the "large" coefficients.
- Modification of the coefficients according to the task.

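This compression principle is easy to simulate: expand a signal in an orthonormal basis, keep only the largest coefficients (the index set ΛN), and measure the reconstruction error. A minimal sketch in Python, using an orthonormal DCT as a stand-in for the representation system; the test signal and the choice of basis are illustrative assumptions, not taken from the slides.

```python
import numpy as np
from scipy.fft import dct, idct  # orthonormal DCT as a stand-in basis

# a smooth test signal: its DCT coefficients decay quickly
t = np.linspace(0, 1, 256)
f = np.sin(2 * np.pi * 3 * t) + 0.5 * np.sin(2 * np.pi * 7 * t)

c = dct(f, norm='ortho')                 # coefficients <f, psi_lambda>

# keep only the largest coefficients (the index set Lambda_N)
keep = 20
idx = np.argsort(np.abs(c))[::-1][:keep]
c_approx = np.zeros_like(c)
c_approx[idx] = c[idx]

f_approx = idct(c_approx, norm='ortho')  # f ~ sum over Lambda_N
print("relative L2 error:", np.linalg.norm(f - f_approx) / np.linalg.norm(f))
```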

SLIDE 10–11

Review of Wavelets for L2(R2)

Definition (1D): Let φ ∈ L2(R) be a scaling function and ψ ∈ L2(R) be a wavelet. Then the associated wavelet system is defined by

{φ(x − m) : m ∈ Z} ∪ {2^{j/2} ψ(2^j x − m) : j ≥ 0, m ∈ Z}.

Definition (2D): A wavelet system is defined by

{φ^(1)(x − m) : m ∈ Z²} ∪ {2^j ψ^(i)(2^j x − m) : j ≥ 0, m ∈ Z², i = 1, 2, 3},

where φ^(1)(x) = φ(x1)φ(x2), ψ^(1)(x) = φ(x1)ψ(x2), ψ^(2)(x) = ψ(x1)φ(x2), and ψ^(3)(x) = ψ(x1)ψ(x2).
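As a concrete instance of the 2D tensor construction, here is a minimal sketch of one level of the discrete Haar transform in NumPy: 1D Haar filtering along rows and then columns yields four subbands corresponding to φ^(1), ψ^(1), ψ^(2), ψ^(3). The discrete filters and the subband labels are standard Haar conventions assumed for illustration, not taken from the slides.

```python
import numpy as np

def haar_rows(X):
    """One level of 1D Haar filtering along the rows of X (even width):
    lowpass (scaling phi) and highpass (wavelet psi) parts."""
    low = (X[:, 0::2] + X[:, 1::2]) / np.sqrt(2)
    high = (X[:, 0::2] - X[:, 1::2]) / np.sqrt(2)
    return low, high

def haar2d_step(img):
    """One 2D level via the tensor construction: rows, then columns."""
    L, H = haar_rows(img)
    LL, LH = haar_rows(L.T)   # filter the columns of the lowpass part
    HL, HH = haar_rows(H.T)
    # LL ~ phi(x1)phi(x2); LH, HL, HH ~ the three wavelet subbands
    return LL.T, LH.T, HL.T, HH.T

img = np.random.rand(8, 8)
LL, LH, HL, HH = haar2d_step(img)
print(LL.shape)  # (4, 4): each subband holds a quarter of the samples
```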

SLIDE 12

The World is Compressible!

- Images: N pixels, but only k << N large wavelet coefficients.
- Wideband signals: N samples, but only k << N large Gabor coefficients.

SLIDE 13

JPEG2000

[Figure: JPEG2000 compression to 1/20 vs. compression to 1/200]

SLIDE 14

The New Paradigm for Data Processing: Sparsity!

Sparse signals: A signal x ∈ R^N is k-sparse if ‖x‖₀ = #{non-zero coefficients} ≤ k. Model Σk: a union of k-dimensional subspaces.

Compressible signals: A signal x ∈ R^N is compressible if the sorted coefficients have rapid (power-law) decay. Model: ℓp ball with p ≤ 1.

[Figure: sorted coefficients |x_i|, large up to index k, then decaying toward N]
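A quick numerical illustration of the two models (a hypothetical sketch; the distributions are chosen purely for illustration): an exactly k-sparse vector drawn from Σk, and a compressible vector whose sorted entries follow a power law.

```python
import numpy as np

rng = np.random.default_rng(0)
N, k = 1000, 10

# exactly k-sparse: a random element of the union-of-subspaces model Sigma_k
x_sparse = np.zeros(N)
support = rng.choice(N, size=k, replace=False)
x_sparse[support] = rng.standard_normal(k)
print(np.count_nonzero(x_sparse))   # ||x||_0 = 10

# compressible: sorted coefficients decay like i^(-1/p) with p <= 1
p = 0.5
x_comp = rng.permutation(np.arange(1, N + 1) ** (-1.0 / p))
decay = np.sort(np.abs(x_comp))[::-1]
print(decay[:3], decay[-1])         # rapid power-law decay, no exact zeros
```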

SLIDE 15–17

"Not everything that can be counted counts..." (Einstein)

Classical Approach:

x (N samples) → Sensing/Sampling → Compression (k << N values) → Reconstruction → x̂

Sensing/Sampling:
◮ Linear processing.
Compression:
◮ Non-linear processing.

Why acquire N samples only to discard all but k pieces of data?

Fundamental Idea: Directly acquire "compressed data", i.e., the information content. Take more universal measurements:

x → Compressed Sensing (n measurements, k < n << N) → Reconstruction → x̂

SLIDE 18–19

Compressed Sensing enters the Stage

'Initial' Papers:
• E. Candès, J. Romberg, T. Tao, Stable signal recovery from incomplete and inaccurate measurements, Comm. Pure Appl. Math. 59 (2006), 1207–1223.
• D. Donoho, Compressed sensing, IEEE Trans. Inform. Theory 52 (2006), 1289–1306.

Avalanche of Results (dsp.rice.edu/cs):
• Approx. 2000 papers and 150 conferences so far.

Relation to the following areas: applied harmonic analysis, applied linear algebra, convex optimization, geometric functional analysis, random matrix theory.

Application areas: radar, astronomy, biology, seismology, signal processing, and more.

SLIDE 20

What is Compressed Sensing...?

SLIDE 21–22

Compressed Sensing Problem, I

General Procedure:
- Signal x ∈ R^N; x is k-sparse.
- Take n << N linear, non-adaptive measurements using a matrix A: y = Ax.

[Figure: y = Ax, with A a wide n × N matrix]

Viewpoints: efficient sampling; dimension reduction; efficient representation.

SLIDE 23

Compressed Sensing Problem, II

[Figure: y = Ax]

Fundamental Questions:
- What are suitable signal models?
- When and with which accuracy can the signal be recovered?
- What are suitable sensing matrices?
- How can the signal be algorithmically recovered?

SLIDE 24–26

Fundamental Theorem of Sparse Solutions

Definition: Let A be an n × N matrix. Then spark(A) denotes the minimal number of linearly dependent columns; spark(A) ∈ [2, n + 1].

Lemma: Let A be an n × N matrix, and let k ∈ N. Then the following conditions are equivalent:
(i) For every y ∈ R^n, there exists at most one x ∈ R^N with ‖x‖₀ ≤ k such that y = Ax.
(ii) k < spark(A)/2.

Sketch of Proof: Assume y = Ax₀ = Ax₁ with ‖x₀‖₀, ‖x₁‖₀ ≤ k. Then x₀ − x₁ ∈ N(A) with ‖x₀ − x₁‖₀ ≤ 2k < spark(A), which forces x₀ − x₁ = 0.
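Computing spark(A) is combinatorial, but for tiny matrices a brute-force search over column subsets makes the lemma tangible. A sketch under that caveat (exponential cost; the floating-point rank test is an assumption of the sketch, since numerical rank is not exact):

```python
import numpy as np
from itertools import combinations

def spark(A):
    """Minimal number of linearly dependent columns of A (n x N, n < N).
    Brute force over column subsets -- feasible only for very small matrices."""
    n, N = A.shape
    for size in range(1, n + 2):
        for cols in combinations(range(N), size):
            if np.linalg.matrix_rank(A[:, list(cols)]) < size:
                return size
    return n + 1  # any n+1 columns in R^n are dependent

rng = np.random.default_rng(1)
A = rng.standard_normal((4, 8))
s = spark(A)                    # generically n + 1 = 5 for a random matrix
print(s, "-> unique k-sparse recovery for k <", s / 2)
```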

SLIDE 27–30

Sparsity and ℓ1

Assumption: Letting A be an n × N matrix, n << N, the sought solution x₀ of y = Ax₀ satisfies: ‖x₀‖₀ = #{i : (x₀)ᵢ ≠ 0} is 'small', i.e., x₀ is sparse.

Ideal: Solve

(P0)  min_x ‖x‖₀ subject to y = Ax.

Basis Pursuit (Chen, Donoho, Saunders; 1998):

(P1)  min_x ‖x‖₁ subject to y = Ax.

→ This can be solved by linear programming!

Meta-Result: If the solution x₀ is sufficiently sparse, and A is sufficiently incoherent, then x₀ can be recovered from y via (P1).
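The remark that (P1) reduces to a linear program can be made concrete via the standard variable split x = u − v with u, v ≥ 0, so that ‖x‖₁ = Σ(uᵢ + vᵢ). A minimal sketch using scipy.optimize.linprog, a generic LP solver rather than one of the specialized CS codes; the problem sizes are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import linprog

def basis_pursuit(A, y):
    """Solve min ||x||_1 s.t. y = Ax as an LP via x = u - v, u, v >= 0."""
    n, N = A.shape
    c = np.ones(2 * N)                      # objective: sum(u) + sum(v)
    A_eq = np.hstack([A, -A])               # A(u - v) = y
    res = linprog(c, A_eq=A_eq, b_eq=y, bounds=(0, None))
    u, v = res.x[:N], res.x[N:]
    return u - v

rng = np.random.default_rng(2)
n, N, k = 40, 100, 5
A = rng.standard_normal((n, N)) / np.sqrt(n)
x0 = np.zeros(N)
x0[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
x_hat = basis_pursuit(A, A @ x0)
print(np.linalg.norm(x_hat - x0))           # ~0: exact recovery
```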

SLIDE 31

ℓ1 promotes Sparsity!

[Figure: the affine solution set {x : y = Ax} touches the ℓ2 ball (min ‖x‖₂ s.t. y = Ax) at a generic, non-sparse point, but touches the ℓ1 ball (min ‖x‖₁ s.t. y = Ax) at a sparse vertex]

SLIDE 32–33

Equivalent Condition for Uniqueness of ℓ1

Reminder: spark(A) = min{k : N(A) ∩ Σk ≠ {0}}.

Definition: Let A be an n × N matrix. Then A has the null space property of order k if, for all h ∈ N(A) \ {0} and for all index sets Λ with |Λ| ≤ k,

‖1Λ h‖₁ < ½ ‖h‖₁.

Theorem (Cohen, Dahmen, DeVore; 2008): Let A be an n × N matrix, and let k ∈ N. The following are equivalent:
(i) For every y ∈ R^n, there exists at most one solution in Σk of min_x ‖x‖₁ subject to y = Ax.
(ii) A satisfies the null space property of order k.
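The null space property can be probed numerically on small matrices: take an orthonormal basis of N(A) from the SVD, sample vectors h from the null space, and test ‖1Λh‖₁ < ½‖h‖₁. For a fixed h the worst index set Λ is simply the k largest entries of |h|, so only that set needs checking. A randomized sketch under an important caveat: sampling can only find violations, never certify the property.

```python
import numpy as np

def nsp_violated(A, k, trials=2000, rng=np.random.default_rng(3)):
    """Search for a null space vector violating the NSP of order k.
    Random sampling can find violations but cannot certify the NSP."""
    n, N = A.shape
    _, _, Vt = np.linalg.svd(A)
    null_basis = Vt[n:]                  # rows span N(A) (assumes A full rank)
    for _ in range(trials):
        h = null_basis.T @ rng.standard_normal(N - n)
        # worst index set Lambda: the k largest entries of |h|
        top_k = np.sort(np.abs(h))[::-1][:k].sum()
        if top_k >= 0.5 * np.abs(h).sum():
            return True                  # NSP of order k fails for this h
    return False

A = np.random.default_rng(4).standard_normal((20, 40)) / np.sqrt(20)
print(nsp_violated(A, k=2), nsp_violated(A, k=10))
```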

SLIDE 34–35

Sufficient Condition for 'ℓ0 = ℓ1': Coherence

Definition: Let A = (a₁, ..., a_N) be an n × N matrix. Then its coherence µ(A) is

µ(A) = max_{i≠j} |⟨aᵢ, aⱼ⟩| / (‖aᵢ‖₂ ‖aⱼ‖₂) ∈ [√((N − n)/(n(N − 1))), 1].

Theorem (Elad, Bruckstein; 2002) (Donoho, Elad; 2003): Let A be an n × N matrix, and let x₀ ∈ R^N \ {0} satisfy

‖x₀‖₀ < ½ (1 + µ(A)⁻¹).

Then x₀ is the unique solution of

min_x ‖x‖₀ s.t. Ax = Ax₀   and   min_x ‖x‖₁ s.t. Ax = Ax₀.
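Both the coherence and the resulting recovery guarantee are one-liners to evaluate numerically. A minimal sketch; the lower bound √((N − n)/(n(N − 1))), often called the Welch bound, is printed for comparison.

```python
import numpy as np

def coherence(A):
    """mu(A) = max_{i != j} |<a_i, a_j>| / (||a_i||_2 ||a_j||_2)."""
    G = A / np.linalg.norm(A, axis=0)       # normalize the columns
    G = np.abs(G.T @ G)
    np.fill_diagonal(G, 0)
    return G.max()

rng = np.random.default_rng(5)
n, N = 20, 40
A = rng.standard_normal((n, N))
mu = coherence(A)
welch = np.sqrt((N - n) / (n * (N - 1)))    # lower bound on mu(A)
k_max = 0.5 * (1 + 1 / mu)                  # guarantee for ||x||_0 < k_max
print(f"mu = {mu:.3f} (Welch bound {welch:.3f}), guarantee for k < {k_max:.2f}")
```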

SLIDE 36–37

Sufficient Condition for 'ℓ0 = ℓ1': RIP

Again Key Idea: Sparsity. Our signal is k-sparse, so Φ acts effectively as an n × k matrix.

⇒ Design Φ so that each of its n × k submatrices is full rank!

Definition: Let A be an n × N matrix. Then A has the Restricted Isometry Property (RIP) of order k if there exists δk ∈ (0, 1) with

(1 − δk)‖x‖₂² ≤ ‖Ax‖₂² ≤ (1 + δk)‖x‖₂²   for all x ∈ Σk.
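Computing δk exactly requires checking all supports and is NP-hard, but a Monte Carlo search over random supports gives a lower bound on δk and some intuition: on a fixed support S, the RIP inequalities say exactly that the eigenvalues of A_Sᵀ A_S lie in [1 − δk, 1 + δk]. A sketch under that caveat:

```python
import numpy as np

def rip_lower_bound(A, k, trials=1000, rng=np.random.default_rng(6)):
    """Monte Carlo lower bound on the RIP constant delta_k of A:
    max over tested supports S of the eigenvalue deviation of A_S^T A_S."""
    n, N = A.shape
    delta = 0.0
    for _ in range(trials):
        S = rng.choice(N, size=k, replace=False)
        G = A[:, S].T @ A[:, S]             # Gram matrix of k columns
        eigs = np.linalg.eigvalsh(G)        # ascending eigenvalues
        delta = max(delta, abs(eigs[0] - 1), abs(eigs[-1] - 1))
    return delta

rng = np.random.default_rng(7)
n, N, k = 64, 256, 8
A = rng.standard_normal((n, N)) / np.sqrt(n)   # E||Ax||^2 = ||x||^2
print(rip_lower_bound(A, k))                    # estimated delta_k (a lower bound)
```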

SLIDE 38–39

Restricted Isometry Property (RIP)

Stable Embedding: Φ shall preserve the geometry of the set of sparse signals:

‖x₁ − x₂‖ ≈ ‖Φ(x₁) − Φ(x₂)‖.

But this is a combinatorial NP-hard design problem!

SLIDE 40

Insight from Banach Space Theory

General Approach to RIP: Based on work by Garnaev, Gluskin, and Kashin ('77 & '84), design Φ to be a random matrix, e.g.,
◮ Gaussian i.i.d.
◮ Bernoulli (±1) i.i.d.
◮ ...

Such matrices Φ have the Restricted Isometry Property with high probability if n = O(k · log(N/k)) << N.
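Generating such matrices is straightforward; the 1/√n normalization makes E‖Ax‖₂² = ‖x‖₂², matching the RIP scaling. A sketch, with the constant in n = O(k log(N/k)) chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(8)
N, k = 1000, 10
n = int(np.ceil(4 * k * np.log(N / k)))    # n = O(k log(N/k)); constant 4 is arbitrary

A_gauss = rng.standard_normal((n, N)) / np.sqrt(n)           # Gaussian iid
A_bern = rng.choice([-1.0, 1.0], size=(n, N)) / np.sqrt(n)   # Bernoulli (+-1) iid
print(n, A_gauss.shape, A_bern.shape)
```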

SLIDE 41

Sufficient Condition for 'ℓ0 = ℓ1': RIP

Theorem (Cohen, Dahmen, DeVore; 2008) (Candès; 2008): Let A be an n × N matrix which satisfies the RIP of order 2k with δ2k < √2 − 1, and let x₀ ∈ R^N. Then the solution x̂ of

min_x ‖x‖₁ subject to Ax = Ax₀

satisfies

‖x₀ − x̂‖₂ ≤ C · σk(x₀)₁ / √k,

where σk(x₀)₁ is the ℓ1-error of best k-term approximation to x₀, i.e., σk(x₀)₁ := inf_{y∈Σk} ‖x₀ − y‖₁.

SLIDE 42

Sensing Matrices and Recovery Algorithms...

SLIDE 43

Sensing Matrices

Deterministic Matrices:
- n × N Vandermonde matrices: spark(A) = n + 1, but poorly conditioned.
- n × n² equiangular tight frames (Strohmer, Heath; 2003): µ(A) = 1/√n; then n = O(k² log N), but N = n².
- n × N matrices (Bourgain, DeVore, Haupt, et al.; 2007–): n ∼ k^{2−µ}, but µ is very small.

SLIDE 44

Sensing Matrices

Random Matrices:
- n × N matrices with i.i.d. entries: spark(A) = n + 1 with probability 1.
- n × N matrices with subgaussian distribution (Candès, Donoho, et al.; 2006–): If n = O(k log(N/k)), then A satisfies the RIP of order 2k with probability at least 1 − 2e^{−c·n}, i.e., with 'overwhelmingly high probability'.

Question: How far can we get with deterministic matrices?

SLIDE 45

Sparse Recovery Algorithms: ℓ1 Minimization

Convex problem:

min_x ‖x‖₁ subject to y = Ax

Convex problem with a conic constraint:

min_x ‖x‖₁ subject to ‖Ax − y‖₂² ≤ ε

Equivalent unconstrained version:

min_x ½‖Ax − y‖₂² + λ‖x‖₁

→ Specialized algorithms for Compressed Sensing!
→ www.acm.caltech.edu/l1magic and sparselab.stanford.edu
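For the unconstrained version, one of the simplest specialized methods is iterative soft thresholding (ISTA), which alternates a gradient step on ½‖Ax − y‖₂² with the soft-thresholding proximal map of λ‖x‖₁. A minimal sketch (ISTA is not named on this slide; the step size 1/‖A‖₂² is the standard choice that guarantees convergence):

```python
import numpy as np

def ista(A, y, lam, iters=500):
    """Minimize (1/2)||Ax - y||_2^2 + lam * ||x||_1 by iterative
    soft thresholding; step size 1/||A||_2^2 ensures convergence."""
    t = 1.0 / np.linalg.norm(A, 2) ** 2
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - t * A.T @ (A @ x - y)                         # gradient step
        x = np.sign(z) * np.maximum(np.abs(z) - t * lam, 0)   # soft threshold
    return x

rng = np.random.default_rng(9)
n, N, k = 40, 100, 5
A = rng.standard_normal((n, N)) / np.sqrt(n)
x0 = np.zeros(N); x0[rng.choice(N, k, replace=False)] = 1.0
x_hat = ista(A, A @ x0, lam=0.01)
print(np.linalg.norm(x_hat - x0))   # small for small lam
```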

SLIDE 46

Sparse Recovery Algorithms: Greedy and Combinatorial

Greedy Algorithms:
- Orthogonal Matching Pursuit
- Iterative Thresholding
- ...

Combinatorial Algorithms:
- Combinatorial group testing
- Data streams
- ...
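Orthogonal Matching Pursuit is short enough to state in full: greedily pick the column most correlated with the residual, then re-fit by least squares on the selected support. A minimal sketch, assuming roughly normalized columns and exactly k iterations:

```python
import numpy as np

def omp(A, y, k):
    """Orthogonal Matching Pursuit: k greedy column selections,
    each followed by a least-squares fit on the current support."""
    residual, support = y.copy(), []
    x = np.zeros(A.shape[1])
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated column
        support.append(j)
        coef, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coef
    x[support] = coef
    return x

rng = np.random.default_rng(10)
n, N, k = 40, 100, 5
A = rng.standard_normal((n, N)) / np.sqrt(n)
x0 = np.zeros(N); x0[rng.choice(N, k, replace=False)] = rng.standard_normal(k)
print(np.linalg.norm(omp(A, A @ x0, k) - x0))   # ~0 for incoherent A
```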

SLIDE 47

Compressed Sensing in Action...

SLIDE 48

Application Areas of Compressed Sensing

Imaging sciences, radar technology, communications theory, information theory, biology, geology/seismology, astronomy, optics, business, remote sensing, compression/dimension reduction, medicine.

SLIDE 49

Further Applications

Astronomy
◮ Cosmic microwave background, Planck mission, ...
Communication
◮ Channel estimation, (sensor) networks, ...
Computational Biology
◮ DNA microarrays, ...
Geophysical Data Analysis
◮ Seismic data recovery, wavefield extrapolation, ...
Photography
◮ Single-pixel camera, ...
Physics
◮ Simulation of atomic systems, quantum state tomography, ...
...

SLIDE 50

Topics of this Winter School

SLIDE 51

Winter School on "Compressed Sensing"

Topics:
- Model of sparse vectors → model of low-rank matrices (Rachel Ward)
- Model of sparse vectors → model of sparse lattice vectors (Axel Flinth)
- Measurement matrices → structured random matrices (Holger Rauhut)
- Measurements → non-linearity (Roman Vershynin and Rachel Ward)
- Application/Extension: Recovery of high-dimensional functions (Massimo Fornasier)
- Applications: Data separation, missing data recovery & Fourier data (Gitta Kutyniok)
- Applications: Proteomics analysis & MRI (Martin Genzel & Jackie Ma)

SLIDE 52

Let's conclude...

SLIDE 53

Conclusions

- Sparsity is a natural model for signals.
- Compressed Sensing: Sparse high-dimensional signals can be recovered efficiently from a small set of linear, non-adaptive measurements!
- Various connections to different areas inside mathematics and across disciplines.
- Examples of applications: astronomy, biology, communication, radar, ...
- Compressed Sensing for future technologies: great potential, but a wide open field!

SLIDE 54

Technische Universität Berlin, Applied Functional Analysis Group

THANK YOU!

Contact: www.math.tu-berlin.de/∼kutyniok
Code available at: www.ShearLab.org

Related Books:
• Y. Eldar and G. Kutyniok, Compressed Sensing: Theory and Applications, Cambridge University Press, 2012.
• S. Foucart and H. Rauhut, A Mathematical Introduction to Compressive Sensing, Birkhäuser-Springer, 2013.