

SLIDE 1

Lecture 1: Applied Harmonic Analysis and Compressed Sensing

Gitta Kutyniok

(Technische Universität Berlin)

Winter School on “Compressed Sensing”, TU Berlin December 3–5, 2015

Gitta Kutyniok (TU Berlin) Lecture 1 Winter School 2015 1 / 50

SLIDE 2

Outline

1. The Geometric Separation Problem: Inspiring Empirical Results; Goal for Today
2. Separation via Compressed Sensing: Sparsity and Underdetermined Systems; Avalanche of Recent Work
3. Separation of Points and Curves: Wavelet and Shearlet Systems; Asymptotic Separation Result; General Separation Estimate; Microlocal Analysis Heuristics
4. Conclusions

SLIDE 3

General Challenge in Data Analysis

Modern data is often composed of two or more morphologically distinct constituents, and we face the task of separating those components given only the composed data. Examples include:
Audio data: sinusoids and peaks.
Imaging data: cartoon and texture.
High-dimensional data: lower-dimensional structures of different dimensions.

SLIDE 4

Separating Artifacts in Images, I

[Figure: an image decomposed into its morphological components.]

(Source: J. L. Starck, M. Elad, D. L. Donoho; 2005 (artificial data))

SLIDE 5

Separating Artifacts in Images, II

[Figure: an image separated into cartoon and texture components.]

(Source: J. L. Starck, M. Elad, D. L. Donoho; 2005)

SLIDE 6

Problem from Neurobiology

Alzheimer's research: Detection of characteristics of Alzheimer's disease. This requires the separation of spines and dendrites.

(Confocal laser scanning microscopy)

SLIDE 7

Numerical Result

[Figure: numerical separation result for a dendrite image.]

(Source: Brandt, K, Lim, Sündermann; 2010)

SLIDE 8

Goal for Today

Neurobiological data: Observed signal x = x1 + x2, where x1 = point structures and x2 = curvilinear structures.

Challenges for today: Find a mathematical methodology that derives the empirical results, and identify the fundamental mathematical concept behind them.

SLIDE 9

Applied Harmonic Analysis Approach to Imaging Science

Exploit a carefully designed representation system (ψλ)λ∈Λ ⊆ L²(R²):

L²(R²) ∋ f ↦ (⟨f, ψλ⟩)λ∈Λ ↦ Σ_{λ∈Λ} ⟨f, ψλ⟩ ψλ = f.

Desiderata:
Special features are encoded in the “large” coefficients |⟨f, ψλ⟩|.
Efficient representations: f ≈ Σ_{λ∈ΛN} ⟨f, ψλ⟩ ψλ with #(ΛN) small.
Methodology: Modify the coefficients according to the task at hand.

SLIDE 10

How does Compressed Sensing help with Component Separation?

SLIDE 11–13

‘Mathematical Model’

Model for two components: Observe a signal x composed of two subsignals x1 and x2: x = x1 + x2. Task: Extract the two subsignals x1 and x2 from x, given only x.

Isn’t this impossible? There are two unknowns for every datum.

But we have additional information: the two components are geometrically different.

SLIDE 14

Birth of ℓ1-Component Separation (2001)

Composition of sinusoids and spikes sampled at n points:

x = x1⁰ + x2⁰ = Φ1c1⁰ + Φ2c2⁰ = [Φ1 | Φ2] [c1⁰; c2⁰],

where x, c1⁰, and c2⁰ are n × 1,
Φ1 is the n × n Fourier matrix ((Φ1)_{t,k} = e^{2πitk/n}),
Φ2 is the n × n identity matrix.

[Figure: a sampled signal composed of sinusoids and spikes.]

SLIDE 15–18

Sparsity and ℓ1

Assumption: Let A be an n × N matrix, n ≪ N. The sought solution c⁰ of x = Ac⁰ satisfies: ‖c⁰‖₀ = #{i : ci⁰ ≠ 0} is ‘small’, i.e., c⁰ is sparse.

Ideal: Solve

(P0)  min_c ‖c‖₀ subject to x = Ac.

Basis Pursuit (Chen, Donoho, Saunders; 1998):

(P1)  min_c ‖c‖₁ subject to x = Ac.

Meta-Result: If the solution c⁰ is sufficiently sparse and A is sufficiently incoherent, then c⁰ can be recovered from x via (P1).
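The meta-result can be made concrete at toy scale. The sketch below (our own illustration in numpy; the function name `basis_pursuit_bruteforce` is ours, not from the lecture) exploits that (P1) is a linear program, so some minimizer has at most n nonzeros; for tiny dimensions one can therefore enumerate supports directly instead of calling an LP solver.

```python
import itertools
import numpy as np

def basis_pursuit_bruteforce(A, x):
    """Solve (P1): min ||c||_1 subject to Ac = x, by brute force.

    (P1) is a linear program, so some minimizer has at most n nonzeros
    (n = number of equations). For toy sizes we can enumerate every
    size-n support, solve the square system, and keep the feasible
    candidate of smallest l1 norm. Exponential cost: this illustrates
    the optimization problem, not the solvers used in practice.
    """
    n, N = A.shape
    best, best_l1 = None, np.inf
    for S in itertools.combinations(range(N), n):
        AS = A[:, list(S)]
        if abs(np.linalg.det(AS)) < 1e-10:
            continue                      # selected columns not invertible
        c = np.zeros(N)
        c[list(S)] = np.linalg.solve(AS, x)
        l1 = np.abs(c).sum()
        if l1 < best_l1:
            best, best_l1 = c, l1
    return best

# Toy dictionary [Phi1 | Phi2]: a 4x4 orthonormal Hadamard basis next to the identity.
H = np.array([[1.0]])
for _ in range(2):
    H = np.block([[H, H], [H, -H]])
H /= 2.0
A = np.hstack([H, np.eye(4)])

c0 = np.zeros(8)
c0[0], c0[5] = 1.0, 2.0          # one atom from each basis: a sparse ground truth
x = A @ c0
c_hat = basis_pursuit_bruteforce(A, x)
```

When c⁰ is sufficiently sparse and the dictionary sufficiently incoherent, the minimizer coincides with c⁰; unconditionally, the output is feasible and its ℓ1 norm never exceeds that of the ground truth.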

SLIDE 19–21

First Results of Compressed Sensing

Composition of sinusoids and spikes sampled at n points:

x = x1⁰ + x2⁰ = Φ1c1⁰ + Φ2c2⁰ = [Φ1 | Φ2] [c1⁰; c2⁰].

Theorem (Donoho, Huo; 2001): If

#(sinusoids) + #(spikes) = ‖c1⁰‖₀ + ‖c2⁰‖₀ < (1 + √n)/2,

then (c1⁰, c2⁰) = argmin ‖c1‖₁ + ‖c2‖₁ subject to x = Φ1c1 + Φ2c2.

Theorem (Bruckstein, Elad; 2002), (Donoho, Elad; 2003): Let A = (ai)_{i=1}^N be an n × N matrix with normalized columns, n ≪ N, and let c⁰ satisfy

‖c⁰‖₀ < (1 + µ(A)⁻¹)/2,  with coherence µ(A) = max_{i≠j} |⟨ai, aj⟩|.

Then c⁰ = argmin ‖c‖₁ subject to x = Ac.
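A quick numerical check of the coherence appearing in the theorems (a sketch in numpy; the names are ours): for the sinusoid–spike dictionary [Φ1 | Φ2] with normalized columns, µ = 1/√n, so the bound (1 + µ⁻¹)/2 is exactly the (1 + √n)/2 of the Donoho–Huo theorem.

```python
import numpy as np

def mutual_coherence(A):
    """Largest absolute inner product between distinct normalized columns."""
    A = A / np.linalg.norm(A, axis=0)       # normalize columns
    G = np.abs(A.conj().T @ A)              # Gram-matrix magnitudes
    np.fill_diagonal(G, 0.0)                # ignore <a_i, a_i> = 1
    return G.max()

n = 64
t = np.arange(n)
F = np.exp(2j * np.pi * np.outer(t, t) / n) / np.sqrt(n)  # unitary Fourier basis
A = np.hstack([F, np.eye(n)])               # dictionary [Phi1 | Phi2]

mu = mutual_coherence(A)                    # = 1/sqrt(n) for this pair
bound = 0.5 * (1 + 1 / mu)                  # sparsity level guaranteeing l1 recovery
```

Within each orthonormal block the inner products vanish, so the coherence is driven purely by the Fourier-versus-spike cross terms, all of modulus 1/√n.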

SLIDE 22

Component Separation using Compressed Sensing

Let x be a signal composed of two subsignals x1⁰ and x2⁰:

x = x1⁰ + x2⁰.

Desiderata for two orthonormal bases Φ1 and Φ2:
xi⁰ = Φi ci⁰ with ‖ci⁰‖₀ small, i = 1, 2  (Sparsity!)
µ([Φ1 | Φ2]) small  (Morphological difference!)

Solve

(c1⋆, c2⋆) = argmin ‖c1‖₁ + ‖c2‖₁ subject to x = Φ1c1 + Φ2c2,

and derive the approximate components xi⁰ ≈ xi⋆ = Φi ci⋆, i = 1, 2.

SLIDE 23

Two Paths

SLIDE 24

Avalanche of Recent Work

Problem: Solve x = Ac⁰ with A an n × N matrix (n < N).

Deterministic world:
Mutual coherence of A = (ak)k.
Bound on ‖c⁰‖₀ depending on µ(A).
Efficiently solve the problem x = Ac⁰.
Contributors: Bruckstein, Cohen, Dahmen, DeVore, Donoho, Elad, Fuchs, Gribonval, Huo, K, Rauhut, Temlyakov, Tropp, ...

Random world:
Restricted isometry constants of a random A = (ak)k.
Bound ‖c⁰‖₀ by n/(2 log(N/n))(1 + o(1)).
Efficiently solve the problem x = Ac⁰ with high probability.
Contributors: Candès, Donoho, Fornasier, K, Krahmer, Rauhut, Romberg, Tanner, Tao, Tropp, Vershynin, Ward, ...

SLIDE 25–26

Novel Direction for Sparsity

Geometric clustering: x = Ac⁰ with A an n × N matrix (n < N).
The nonzeros of c⁰ often
◮ arise not in arbitrary patterns,
◮ but are rather highly structured.
The interactions between columns of A in ill-posed problems
◮ are not arbitrary,
◮ but rather geometrically driven.

Other results on “structured sparsity”: joint sparsity, fusion frame sparsity, block sparsity, ...
Contributors: Boufounos, Ehler, Eldar, Gribonval, Fornasier, K, Rauhut, Schnass, Vandergheynst, Vershynin, Ward, ...

SLIDE 27

How can these Ideas be applied to Separation of Points and Curves?

SLIDE 28

Back to Neurobiological Imaging

Two morphologically distinct components:
◮ points,
◮ curves.

Choose suitable representation systems which provide optimally sparse representations of
◮ pointlike structures → wavelets,
◮ curvelike structures → shearlets.

Minimize the ℓ1 norm of the coefficients. This forces
◮ the pointlike objects into the wavelet part of the expansion,
◮ the curvelike objects into the shearlet part.

SLIDE 29

Empirical Separation of Spines and Dendrites

[Figure: spine/dendrite image and its wavelet and shearlet components.]

Wavelet expansion | Shearlet expansion
(Source: Brandt, K, Lim, Sündermann; 2010)

SLIDE 30

Chosen Pair

Optimal for pointlike structures: orthonormal wavelets, a basis with perfectly isotropic generating elements at different scales.
Optimal for curvelike structures: shearlets (K, Labate; 2006), a highly directional frame with increasingly anisotropic elements at fine scales (→ www.ShearLab.org).

SLIDE 31–32

Review of Wavelets for L²(R²)

Definition (1D): Let φ ∈ L²(R) be a scaling function and ψ ∈ L²(R) be a wavelet. Then the associated wavelet system is defined by

{φ(x − m) : m ∈ Z} ∪ {2^{j/2} ψ(2^j x − m) : j ≥ 0, m ∈ Z}.

Definition (2D): A wavelet system is defined by

{φ(1)(x − m) : m ∈ Z²} ∪ {2^j ψ(i)(2^j x − m) : j ≥ 0, m ∈ Z², i = 1, 2, 3},

where φ(1)(x) = φ(x1)φ(x2), ψ(1)(x) = φ(x1)ψ(x2), ψ(2)(x) = ψ(x1)φ(x2), ψ(3)(x) = ψ(x1)ψ(x2).

Theorem: Discrete wavelets provide optimally sparse approximations of functions f ∈ L²(R²) which are C² apart from point singularities:

‖f − fN‖₂² ≍ N⁻¹, N → ∞, where fN = Σ_{λ∈ΛN} cλ ψλ.
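The N-term approximation behaviour can be illustrated in 1D with the Haar wavelet (a minimal sketch of our own, not the construction used in the lecture): since the transform is orthonormal, the best N-term error is exactly the ℓ2 norm of the discarded coefficients.

```python
import numpy as np

def haar_coeffs(v):
    """Orthonormal 1D Haar transform; input length must be a power of two."""
    v = np.asarray(v, dtype=float).copy()
    detail = []
    while v.size > 1:
        s = (v[0::2] + v[1::2]) / np.sqrt(2.0)   # averages -> coarser scale
        d = (v[0::2] - v[1::2]) / np.sqrt(2.0)   # differences -> wavelet details
        detail.append(d)
        v = s
    return np.concatenate([v] + detail[::-1])    # coarse-to-fine ordering

def nterm_error(coeffs, N):
    """l2 error of the best N-term approximation.

    The Haar transform is orthonormal, so keeping only the N largest
    coefficients incurs exactly the l2 norm of the discarded ones.
    """
    mags = np.sort(np.abs(coeffs))[::-1]
    return np.sqrt(np.sum(mags[N:] ** 2))

# A 1D signal that is smooth apart from one jump (a "point singularity").
x = np.linspace(0.0, 1.0, 1024, endpoint=False)
f = np.sin(2 * np.pi * x) + (x > 0.5)

c = haar_coeffs(f)
errs = [nterm_error(c, N) for N in (10, 40, 160)]   # decays as N grows
```

Thresholding the coefficient sequence in this way is precisely the "modification of the coefficients" mentioned on Slide 9, specialized to sparse approximation.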

SLIDE 33–35

Fitting Model for Anisotropic Structures

[Figure: anisotropic features in natural images, Field et al., 1993]

Definition (Donoho; 2001): The set of cartoon-like images E²(R²) is defined by

E²(R²) = {f ∈ L²(R²) : f = f0 + f1 · χB},

where B ⊂ [0, 1]² with ∂B a closed C² curve, and f0, f1 ∈ C₀²([0, 1]²).

Theorem (Donoho; 2001): Let (ψλ)λ ⊆ L²(R²). Allowing only polynomial depth search, the optimal asymptotic approximation error of f ∈ E²(R²) is

‖f − fN‖₂² ≍ N⁻², N → ∞, where fN = Σ_{λ∈ΛN} cλ ψλ.

SLIDE 36

Beyond Wavelets...

Observation: Wavelets only achieve

‖f − fN‖₂² ≍ N⁻¹, N → ∞.

Wavelets cannot approximate curvilinear singularities optimally sparsely.
Reason: The isotropic structure of wavelets, 2^j ψ(2^j x − m): the scaling matrix diag(2^j, 2^j) treats both directions equally.

[Intuitive explanation: figure omitted.]

SLIDE 37–38

Main Goal in Geometric Multiscale Analysis

Design a representation system which...
...is generated by one ‘mother function’,
...provides optimally sparse approximation of cartoons,
...allows for compactly supported analyzing elements,
...is associated with fast decomposition algorithms,
...treats the continuum and digital ‘world’ uniformly.

Non-exhaustive list of approaches:
Ridgelets (Candès and Donoho; 1999)
Curvelets (Candès and Donoho; 2002)
Contourlets (Do and Vetterli; 2002)
Bandlets (LePennec and Mallat; 2003)
Shearlets (K and Labate; 2006)

SLIDE 39–40

Scaling and Orientation

Parabolic scaling:

Aj = [2^j 0; 0 2^{j/2}],  j ∈ Z.

Historical remark: 1970s: Fefferman and Seeger/Sogge/Stein.

Orientation via shearing:

Sk = [1 k; 0 1],  k ∈ Z.

Advantage: Shearing leaves the digital grid Z² invariant, giving a uniform theory for the continuum and digital situation.
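The two matrix families can be written down directly (a toy sketch in numpy; function names are ours): Sk has integer entries and determinant 1, hence maps Z² into itself, while Aj makes elements increasingly anisotropic as j grows.

```python
import numpy as np

def parabolic_scaling(j):
    """A_j = diag(2^j, 2^(j/2)): scales twice as fast in x1 as in x2."""
    return np.diag([2.0 ** j, 2.0 ** (j / 2.0)])

def shear(k):
    """S_k = [[1, k], [0, 1]]: changes orientation by shearing."""
    return np.array([[1.0, k], [0.0, 1.0]])

# Shearing maps the digital grid Z^2 into itself: S_k has integer
# entries and determinant 1, so S_k z is an integer point whenever z is.
z = np.array([3, -2])
img = shear(5) @ z            # stays on the integer grid

# The anisotropy ratio of A_j grows like 2^(j/2): elements elongate.
A3 = parabolic_scaling(3)
ratio = A3[0, 0] / A3[1, 1]   # 2^3 / 2^(3/2) = 2^(3/2)
```

A rotation by a generic angle would destroy this grid invariance, which is exactly why shearing is preferred for a uniform continuum/digital theory.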

SLIDE 41–42

(Cone-adapted) Discrete Shearlet Systems

Definition (K, Labate; 2006): The (cone-adapted) discrete shearlet system SH(φ, ψ, ψ̃) generated by φ ∈ L²(R²) and ψ, ψ̃ ∈ L²(R²) is the set

{φ(· − m) : m ∈ Z²}
∪ {2^{3j/4} ψ(Sk Aj · − m) : j ≥ 0, |k| ≤ ⌈2^{j/2}⌉, m ∈ Z²}
∪ {2^{3j/4} ψ̃(S̃k Ãj · − m) : j ≥ 0, |k| ≤ ⌈2^{j/2}⌉, m ∈ Z²}.

General framework: Parabolic molecules (Grohs, K; 2013); α-molecules (Grohs, Keiper, K, Schäfer; 2014).

SLIDE 43–44

Compactly Supported Shearlets

Theorem (Kittipoom, K, Lim; 2012): Let φ, ψ, ψ̃ ∈ L²(R²) be compactly supported, and let ψ̂, and the Fourier transform of ψ̃, satisfy certain decay conditions. Then SH(φ, ψ, ψ̃) forms a shearlet frame with controllable frame bounds.
Remark: An exemplary class satisfies B/A ≈ 4.

Theorem (K, Lim; 2011): Let φ, ψ, ψ̃ ∈ L²(R²) be compactly supported, and let their Fourier transforms satisfy certain decay conditions. Then SH(φ, ψ, ψ̃) provides optimally sparse approximations of f ∈ E²(R²), i.e.,

‖f − fN‖₂² ≤ C · N⁻² · (log N)³, N → ∞.

SLIDE 45

Recent Approaches to Fast Shearlet Transforms

www.ShearLab.org:
Separable Shearlet Transform (Lim; 2009)
Digital Shearlet Transform (K, Shahram, Zhuang; 2011)
2D & 3D (parallelized) Shearlet Transform (K, Lim, Reisenhofer; 2013)

Additional code:
Filter-based implementation (Easley, Labate, Lim; 2009)

Theoretical approaches:
Adaptive Directional Subdivision Schemes (K, Sauer; 2009)
Shearlet Unitary Extension Principle (Han, K, Shen; 2011)
Gabor Shearlets (Bodmann, K, Zhuang; 2013)

SLIDE 46

Chosen Pair

Optimal for pointlike structures: orthonormal wavelets, a basis with perfectly isotropic generating elements at different scales.
Optimal for curvelike structures: shearlets (K, Labate; 2006), a highly directional frame with increasingly anisotropic elements at fine scales (→ www.ShearLab.org).

SLIDE 47

Microlocal Model

Neurobiological geometric mixture in 2D:

Point singularities: P(x) = Σ_{i=1}^{P} |x − xi|^{−3/2}.
Curvilinear singularity: C = ∫ δ_{τ(t)} dt, with τ a closed C² curve.
Observed signal: f = P + C.

SLIDE 48

Scale-Dependent Decomposition

Observed object: f = P + C.
Subband decomposition: Wavelets and shearlets use the same scaling subbands!
fj = Pj + Cj, with Pj = P ⋆ Fj and Cj = C ⋆ Fj.
ℓ1-decomposition:
(Wj, Sj) = argmin ‖(⟨Wj, ψλ⟩)λ‖₁ + ‖(⟨Sj, ση⟩)η‖₁ subject to fj = Wj + Sj.

SLIDE 49

Asymptotic Separation

Theorem (Donoho, K; 2013):

(‖Wj − Pj‖₂ + ‖Sj − Cj‖₂) / (‖Pj‖₂ + ‖Cj‖₂) → 0, j → ∞.

At all sufficiently fine scales, nearly-perfect separation is achieved!

SLIDE 50

Analysis of Decomposition within one Scale

Signal model: x = x1⁰ + x2⁰ ∈ H.

Remarks:
Given two Parseval frames Φ1, Φ2 (Φi Φiᵀ x = x for all x).
There are too many decompositions x = Φ1c1 + Φ2c2.
Use x = Φ1(Φ1ᵀ x1) + Φ2(Φ2ᵀ x2), where x = x1 + x2.
The norm is placed on the analysis rather than the synthesis side.

Decomposition technique:

(x1⋆, x2⋆) = argmin_{x1,x2} ‖Φ1ᵀ x1‖₁ + ‖Φ2ᵀ x2‖₁ subject to x = x1 + x2.

SLIDE 51

Relative Sparsity and Cluster Coherence

Let Φ1 = (ϕ1,i)_{i∈I1} and Φ2 = (ϕ2,i)_{i∈I2}.

Definition: For i = 1, 2, xi⁰ is relatively sparse in Φi w.r.t. Λi if

‖1_{Λ1^c} Φ1ᵀ x1⁰‖₁ + ‖1_{Λ2^c} Φ2ᵀ x2⁰‖₁ ≤ δ.

We call Λ1 and Λ2 sets of significant coefficients.

Definition: The cluster coherence for Λ1 is

µc(Λ1) = max_{j∈I2} Σ_{i∈Λ1} |⟨ϕ1,i, ϕ2,j⟩|.

SLIDE 52

Central Estimate

Theorem (Donoho, K; 2013): Suppose x1⁰ and x2⁰ are relatively sparse with Λ1 and Λ2 sets of significant coefficients. Then

‖x1⋆ − x1⁰‖₂ + ‖x2⋆ − x2⁰‖₂ ≤ 2δ / (1 − 2µc),

where µc = max(µc(Λ1), µc(Λ2)).
δ: relative sparsity measure.
µc: cluster coherence.
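Both quantities entering the estimate can be computed for toy frames (a sketch of our own with assumed toy data — the identity basis versus an orthonormal Hadamard basis; function names ours):

```python
import numpy as np

def cluster_coherence(Phi1, Phi2, cluster):
    """mu_c(Lambda_1): max over columns phi_{2,j} of the summed
    |<phi_{1,i}, phi_{2,j}>| with i restricted to the cluster Lambda_1."""
    G = np.abs(Phi1.T @ Phi2)                 # cross-Gramian magnitudes
    return G[list(cluster), :].sum(axis=0).max()

def separation_bound(delta, mu_c):
    """Right-hand side 2*delta / (1 - 2*mu_c) of the central estimate;
    only meaningful when mu_c < 1/2."""
    if mu_c >= 0.5:
        raise ValueError("central estimate requires mu_c < 1/2")
    return 2.0 * delta / (1.0 - 2.0 * mu_c)

# Toy pair: identity basis vs. a 16x16 orthonormal Hadamard basis.
H = np.array([[1.0]])
for _ in range(4):
    H = np.block([[H, H], [H, -H]])
H /= 4.0                                      # all entries are +-1/4
Phi1 = np.eye(16)

mu_c = cluster_coherence(Phi1, H, [3])        # single-index cluster: 1/4
bound = separation_bound(0.1, mu_c)           # 2*0.1 / (1 - 1/2) = 0.4
```

Note how the bound degrades as the cluster grows: already a two-index cluster here yields µc = 1/2, where the estimate becomes vacuous, illustrating why the Λi must be genuinely incoherent across the two frames.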

SLIDE 53

Application of Previous Result

x: filtered signal fj (= Pj + Cj).
Φ1: wavelets filtered with Fj.
Φ2: shearlets filtered with Fj.
Λ1: significant wavelet coefficients of ⟨ψλ, Pj⟩.
Λ2: significant shearlet coefficients of ⟨ση, Cj⟩.
δ: degree of approximation by the significant coefficients.
µc(Λ1), µc(Λ2): cluster coherence of wavelets–shearlets.
Error estimate: 2δ/(1 − 2µc).

SLIDE 54

Application of Previous Result

x: filtered signal fj (= Pj + Cj).
Φ1: wavelets filtered with Fj.
Φ2: shearlets filtered with Fj.
Λ1: significant wavelet coefficients of ⟨ψλ, Pj⟩?
Λ2: significant shearlet coefficients of ⟨ση, Cj⟩?
δ: degree of approximation by the significant coefficients.
µc(Λ1), µc(Λ2): cluster coherence of wavelets–shearlets.
Error estimate: 2δ/(1 − 2µc) = o(‖Pj‖₂ + ‖Cj‖₂) as j → ∞.

SLIDE 55

Singular Support and Wavefront Set of P and C

SLIDE 56

Phase Space Portrait of Wavelets and Shearlets

SLIDE 57

Cluster Coherence

Wavelets in Λ1 ≈ vertical tubes clustering around the point singularities of P.
Shearlets in Λ2 ≈ tubes clustering around the curvilinear phase portrait of C.
A single wavelet is incoherent with the ensemble of shearlets in Λ2.
A single shearlet is incoherent with the ensemble of wavelets in Λ1.

SLIDE 58

Key Idea from Microlocal Analysis

Hart Smith’s phase space metric:

d((s, t); (s′, t′)) = |⟨e_s, t − t′⟩| + |⟨e_{s′}, t − t′⟩| + |t − t′|² + |s − s′|².

‘Approximate’ sets of significant wavelet coefficients:
Λ1,j = {wavelet lattice} ∩ {(s, t) : d((s, t); WF(P)) ≤ ηj aj}.
‘Approximate’ sets of significant shearlet coefficients:
Λ2,j = {shearlet lattice} ∩ {(s, t) : d((s, t); WF(C)) ≤ ηj aj}.
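A direct transcription of the metric (pure Python; taking e_s = (cos s, sin s) as the unit orientation vector is our own convention, not stated on the slide). It makes the anisotropy visible: displacement along e_s is penalized linearly, orthogonal displacement only quadratically.

```python
import math

def phase_space_distance(s, t, s2, t2):
    """Hart Smith's phase space metric between (s, t) and (s2, t2):
    |<e_s, t - t2>| + |<e_s2, t - t2>| + |t - t2|^2 + |s - s2|^2,
    with e_s = (cos s, sin s) -- our convention for the orientation s."""
    dt = (t[0] - t2[0], t[1] - t2[1])
    e1 = (math.cos(s), math.sin(s))
    e2 = (math.cos(s2), math.sin(s2))
    dot = lambda a, b: a[0] * b[0] + a[1] * b[1]
    return (abs(dot(e1, dt)) + abs(dot(e2, dt))
            + dot(dt, dt) + (s - s2) ** 2)

# The metric vanishes exactly when location and orientation both agree ...
d0 = phase_space_distance(0.3, (1.0, 2.0), 0.3, (1.0, 2.0))
# ... and is anisotropic: moving along e_s costs linearly, while moving
# orthogonally to e_s costs only quadratically (parabolic behaviour).
d_along = phase_space_distance(0.0, (0.0, 0.0), 0.0, (0.1, 0.0))
d_orth  = phase_space_distance(0.0, (0.0, 0.0), 0.0, (0.0, 0.1))
```

This quadratic-versus-linear behaviour matches the parabolic scaling of Slide 39: phase-space tubes are short across the singularity and long along it.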

SLIDE 59

Analysis of the Curvilinear Part

The diffeomorphism φi allows us to perform computations for the distribution Lw:

⟨Lw, f⟩ = ∫_{−ρ}^{ρ} w(t) f(t, 0) dt.

Use the linear operator Mφi for the transformation, with the ‘model’

|Mφi(η, η′)| ≤ cN · 2^{−|j−j′|} (1 + min(2^j, 2^{j′}) · d((s, t); χφi(s′, t′)))^{−N}.

SLIDE 60

Essential Estimates

Proposition: (Λ1,j) and (Λ2,j) have the following two properties:
◮ asymptotically negligible cluster coherences:
µc(Λ1,j), µc(Λ2,j) → 0, j → ∞;
◮ asymptotically negligible cluster approximation errors:
δj = δ1,j + δ2,j = o(‖Pj‖₂ + ‖Cj‖₂), j → ∞.

SLIDE 61

Asymptotic Separation

Application of the abstract separation estimate then implies:

Theorem (Donoho, K; 2013):

(‖Wj − Pj‖₂ + ‖Sj − Cj‖₂) / (‖Pj‖₂ + ‖Cj‖₂) → 0, j → ∞.

At all sufficiently fine scales, nearly-perfect separation is achieved!

SLIDE 62–63

Thresholding as Separation Strategy

Theorem (K; 2014):

(‖Wj − Pj‖₂ + ‖Sj − Cj‖₂) / (‖Pj‖₂ + ‖Cj‖₂) → 0, j → ∞.

Asymptotically arbitrarily good separation!

Theorem (K; 2014):

WF(Σj Fj ⋆ Wj) = WF(P) and WF(Σj Fj ⋆ Sj) = WF(C).

Exact separation of the wavefront sets!

SLIDE 64

Let’s conclude...

SLIDE 65

What to take Home...?

One main task in imaging science: component separation.
One approach to imaging science: applied harmonic analysis.
Compressed sensing allows exact solution of underdetermined linear systems of equations if the solution is sparse and the matrix is incoherent.
Separation of point- and curvelike structures:
◮ Wavelets sparsify points and shearlets sparsify curves.
◮ Morphological distance is encoded in incoherence.
◮ Solution: ℓ1 minimization.

SLIDE 66

Technische Universität Berlin Applied Functional Analysis Group

THANK YOU!

References available at:

www.math.tu-berlin.de/∼kutyniok

Code available at:

www.ShearLab.org

Related Books:

  • Y. Eldar and G. Kutyniok

Compressed Sensing: Theory and Applications, Cambridge University Press, 2012.

  • G. Kutyniok and D. Labate

Shearlets: Multiscale Analysis for Multivariate Data, Birkhäuser-Springer, 2012.
