Lecture 3: Applied Harmonic Analysis and Compressed Sensing
Gitta Kutyniok (Technische Universität Berlin)
Winter School on “Compressed Sensing”, TU Berlin, December 3–5, 2015


slide-1
SLIDE 1

Lecture 3: Applied Harmonic Analysis and Compressed Sensing

Gitta Kutyniok

(Technische Universität Berlin)

Winter School on “Compressed Sensing”, TU Berlin December 3–5, 2015

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 1 / 32

slide-2
SLIDE 2

Fourier Sampling

Important Situation: Pointwise samples of the Fourier transform!
Applications: Magnetic Resonance Imaging (MRI), Electron Microscopy, Fourier Optics, X-ray Computed Tomography, Reflection Seismology, ...
Common Model: Let f ∈ L²(R²) with additional regularity assumptions, and ∆ ⊆ Z². Reconstruct f from (f̂(n))_{n∈∆} = (⟨f, e_n⟩)_{n∈∆}, where e_n(x) := e^{2πi⟨x,n⟩}.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 2 / 32
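The pointwise Fourier samples above can be mimicked numerically. The following sketch (grid size and Gaussian test function are illustrative choices, not from the slides) approximates ⟨f, e_n⟩ by a Riemann sum on [0,1]² and checks it against the 2D FFT:

```python
import numpy as np

# Riemann-sum approximation of the Fourier samples f^(n) = <f, e_n>,
# e_n(x) = exp(2*pi*i*<x, n>), for a function discretized on [0,1]^2.
N = 64
x = np.arange(N) / N
X, Y = np.meshgrid(x, x, indexing="ij")
f = np.exp(-((X - 0.5) ** 2 + (Y - 0.5) ** 2) / 0.05)  # smooth test image

def fourier_sample(f, n1, n2):
    """Approximate <f, e_{(n1,n2)}> = integral of f(x) exp(-2 pi i <x,n>) dx."""
    e = np.exp(-2j * np.pi * (n1 * X + n2 * Y))
    return (f * e).sum() / N**2

# On this grid the same numbers come out of the 2D FFT:
fhat = np.fft.fft2(f) / N**2
print(abs(fourier_sample(f, 3, 5) - fhat[3, 5]))  # ~ 0
```

This is why, in the discrete experiments later in the lecture, "Fourier samples" are simply entries of the FFT of the image.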

slide-3
SLIDE 3

Sampling of Fourier Data

(Source: Lim; 2014) Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 3 / 32

slide-4
SLIDE 4

General Sampling Strategy

Fourier measurements: f → (⟨f, e_n⟩)_{n∈∆}.   → Sampling scheme?
Orthonormal basis: {ψ_λ}_{λ∈Λ}.   → Choice of {ψ_λ}_{λ∈Λ}?
Sparse representation: f = Σ_{λ∈Λ} c_λ ψ_λ.   → Model for f?
Reconstruction: (⟨f, e_n⟩ = Σ_{λ∈Λ} ⟨ψ_λ, e_n⟩ c_λ)_{n∈∆} → (c_λ)_{λ∈Λ}.   → Reconstruction algorithm?

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 4 / 32
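In a finite-dimensional toy version, the reconstruction relation ⟨f, e_n⟩ = Σ_λ ⟨ψ_λ, e_n⟩ c_λ is literally a linear system y = Φc. All choices below (vector length, using the canonical basis as {ψ_λ}, the sampled frequencies ∆) are illustrative assumptions, not the lecture's setup:

```python
import numpy as np

# Toy version of <f, e_n> = sum_l <psi_l, e_n> c_l as a linear system in C^N.
N = 32
rng = np.random.default_rng(0)
c = rng.standard_normal(N)          # coefficients of f
Psi = np.eye(N)                     # sparsifying basis (toy choice: canonical)
f = Psi @ c
F = np.fft.fft(np.eye(N))           # row n holds <., e_n> (DFT matrix)
Delta = [0, 3, 7, 12]               # sampled frequencies
y = (F @ f)[Delta]                  # measurements <f, e_n>, n in Delta
Phi = (F @ Psi)[Delta, :]           # matrix of <psi_l, e_n>
print(np.allclose(Phi @ c, y))      # the linear system holds
```

Recovering (c_λ) from such an underdetermined system is exactly where sparsity and ℓ¹ minimization enter.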



slide-13
SLIDE 13

Compressed Sensing Type Approaches

Lustig, Donoho, Pauly; 2007: Sparse MRI. Spirals, L²(R²), wavelets, ℓ¹:
  min_g ‖Ψg‖₁  s.t.  ‖ĝ|_∆ − f̂|_∆‖₂ ≤ ε.
Krahmer, Ward; 2014: Variable density sampling, C^{N×N}, Haar wavelets, TV.
Adcock, Hansen, K, Ma; 2014: Block sampling, L²(R²), wavelets, generalized sampling.
Adcock, Hansen, Poon, Roman; 2014: Multilevel sampling, H, ONS, ℓ¹.
Shi, Yin, Sankaranarayanan, Baraniuk; 2014: Dynamic MRI. Variable density sampling, R × Rⁿ, wavelets, ℓ¹.
...

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 5 / 32


slide-17
SLIDE 17

Appropriate Notion of Optimality?

Ingredients:

Continuum model C ⊆ L²(R²).
◮ Acquiring data in a continuous world.
◮ Optimal best N-term approximation rate: ‖f − f_N‖₂ ≲ N^{−α} as N → ∞ for all f ∈ C, where f_N = Σ_{λ∈Λ_N} c_λ ψ_λ for some frame (ψ_λ)_{λ∈Λ} ⊆ L²(R²).

Sampling schemes ∆_M ⊆ Z² with #∆_M = M and M → ∞.

Reconstruction procedure R : C × ∆ → L²(R²), ∆ = ∪_M {∆_M}.

Asymptotic Optimality: We call a sampling-reconstruction scheme (C, ∆, R) asymptotically optimal if, for all f ∈ C,
  ‖f − R(f, ∆_M)‖₂ ≲ M^{−α} as M → ∞.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 6 / 32

slide-18
SLIDE 18

Let’s start with a suitable Model...

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 7 / 32

slide-19
SLIDE 19

Anisotropic/Cartoon Structures

Images: Governing structure in images. Justified by neurophysiology (Field et al., 1993).

Definition (Donoho; 2001): The set of cartoon-like functions E²(R²) is defined by
  E²(R²) = {f ∈ L²(R²) : f = f₀ + f₁ · χ_B},
where B ⊂ [0,1]² with ∂B a closed C²-curve, and f₀, f₁ ∈ C₀²([0,1]²).

Theorem (Donoho; 2001): Let (ψ_λ)_λ ⊆ L²(R²) be a frame. Then the optimal asymptotic approximation error of f ∈ E²(R²) is
  ‖f − f_N‖₂² ≍ N^{−2}, N → ∞,
where f_N = Σ_{λ∈Λ_N} c_λ ψ_λ.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 8 / 32
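A quick numerical sanity check of this rate (a sketch, not part of the theorem): coefficient moduli decaying like n^{−3/2}, the decay behind the optimal cartoon rate, give a tail energy Σ_{n>N} |c_n|² that scales like N^{−2}:

```python
import numpy as np

# If the sorted coefficients satisfy |c_n| ~ n^(-3/2), then the best N-term
# error  sum_{n > N} |c_n|^2 ~ N^(-2);  here tail[N] * N^2 tends to 1/2.
n = np.arange(1, 10**6 + 1, dtype=float)
c2 = n ** (-3.0)                       # |c_n|^2 for n^(-3/2) decay
tail = c2[::-1].cumsum()[::-1]         # tail[k] = sum_{n >= k+1} |c_n|^2
for N in (10, 100, 1000):
    print(N, tail[N] * N**2)           # approaches 1/2 as N grows
```

The same computation with |c_n| ~ n^{−1} (the decay wavelets achieve for cartoons) only gives a tail of order N^{−1}, which is why an anisotropic system is needed.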


slide-21
SLIDE 21

General Sampling Strategy

Fourier measurements: f → (⟨f, e_n⟩)_{n∈∆}.   → Sampling scheme?
Orthonormal basis: {ψ_λ}_{λ∈Λ}.   → Choice of {ψ_λ}_{λ∈Λ}?
Sparse representation: f = Σ_{λ∈Λ} c_λ ψ_λ, where f is a cartoon-like function.
Reconstruction: (⟨f, e_n⟩ = Σ_{λ∈Λ} ⟨ψ_λ, e_n⟩ c_λ)_{n∈∆} → (c_λ)_{λ∈Λ}.   → Reconstruction algorithm?

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 9 / 32

slide-22
SLIDE 22

Sparsifying Representation System

Parabolic scaling and shearing:
  A_j = [[2^j, 0], [0, 2^{j/2}]]  and  S_k = [[1, k], [0, 1]],  j, k ∈ Z.

Definition (K, Labate; 2006): The (cone-adapted) discrete shearlet system SH(φ, ψ, ψ̃) generated by φ ∈ L²(R²) and ψ, ψ̃ ∈ L²(R²) is the set
  {φ(· − m) : m ∈ Z²}
  ∪ {2^{3j/4} ψ(S_k A_j · − m) : j ≥ 0, |k| ≤ ⌈2^{j/2}⌉, m ∈ Z²}
  ∪ {2^{3j/4} ψ̃(S̃_k Ã_j · − m) : j ≥ 0, |k| ≤ ⌈2^{j/2}⌉, m ∈ Z²}.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 10 / 32
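The matrices in this definition can be written down directly. This sketch checks that |det(S_k A_j)| = 2^{3j/2}, which is exactly why the L²-normalizing factor in the system is 2^{3j/4} = |det(S_k A_j)|^{1/2} (the index values are illustrative):

```python
import numpy as np

# Parabolic scaling A_j = diag(2^j, 2^(j/2)) and shearing S_k = [[1, k], [0, 1]].
def A(j):
    return np.diag([2.0**j, 2.0 ** (j / 2)])

def S(k):
    return np.array([[1.0, k], [0.0, 1.0]])

j, k = 4, 3
M = S(k) @ A(j)
# Shearing has determinant 1, so |det(S_k A_j)| = 2^j * 2^(j/2) = 2^(3j/2);
# the factor 2^(3j/4) normalizes x -> psi(S_k A_j x - m) in L^2.
print(np.linalg.det(M), 2.0 ** (3 * j / 2))  # both equal 64
```

Note also that the shear range |k| ≤ ⌈2^{j/2}⌉ lets the slopes k/2^{j/2} cover one frequency cone.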


slide-24
SLIDE 24

Problem with Frames

Fourier measurements: f → (⟨f, e_n⟩)_{n∈∆}.   → Sampling scheme?
Shearlet frame: {ψ_λ}_{λ∈Λ}.
Sparse representation: f = Σ_{λ∈Λ} c_λ ψ_λ, where f is a cartoon-like function.
Reconstruction: (⟨f, e_n⟩ = Σ_{λ∈Λ} ⟨ψ_λ, e_n⟩ c_λ)_{n∈∆} → (c_λ)_{λ∈Λ}.   → Reconstruction algorithm?

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 11 / 32


slide-27
SLIDE 27

Frame Theory

Let {ψ_λ}_{λ∈Λ} be a frame for H:  ‖(⟨f, ψ_λ⟩)_{λ∈Λ}‖²_{ℓ²} ≍ ‖f‖² for all f ∈ H.

Problem: In general, it is not true that f = Σ_{λ∈Λ} ⟨f, ψ_λ⟩ ψ_λ for all f ∈ H.

Definition: The frame operator associated to {ψ_λ}_{λ∈Λ} is defined by
  S : H → H,  Sf = Σ_{λ∈Λ} ⟨f, ψ_λ⟩ ψ_λ.

Theorem: S is a self-adjoint, positive, and invertible operator. Moreover,
  f = Σ_{λ∈Λ} ⟨f, ψ_λ⟩ ψ̃_λ for all f ∈ H,
where {ψ̃_λ := S^{−1}ψ_λ}_{λ∈Λ} is the associated (canonical) dual frame.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 12 / 32
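The theorem is easy to verify in finite dimensions. The three unit vectors below (a "Mercedes-Benz" frame for R²) are a standard toy example, not from the slides; the code builds the frame operator S, the canonical dual S^{−1}ψ_λ, and checks the reconstruction formula:

```python
import numpy as np

# Frame operator S f = sum_l <f, psi_l> psi_l and reconstruction via the
# canonical dual frame psi~_l = S^{-1} psi_l, for a 3-vector frame in R^2.
angles = np.pi / 2 + np.array([0.0, 2 * np.pi / 3, 4 * np.pi / 3])
Psi = np.stack([np.cos(angles), np.sin(angles)])   # columns are psi_l
S = Psi @ Psi.T                                    # frame operator on R^2
Dual = np.linalg.inv(S) @ Psi                      # canonical dual frame
f = np.array([0.7, -1.3])
f_rec = Dual @ (Psi.T @ f)                         # sum_l <f, psi_l> psi~_l
print(np.allclose(f_rec, f))                       # True
```

For this particular frame S = (3/2)·Id (it is tight), so the dual is just a rescaling; for a non-tight frame such as the shearlet frames below, the dual genuinely differs from the frame.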


slide-30
SLIDE 30

Problem with Frames

Fourier measurements: f → (⟨f, e_n⟩)_{n∈∆}.   → Sampling scheme?
Shearlet frame: {ψ_λ}_{λ∈Λ}.
Sparse representation: f = Σ_{λ∈Λ} c_λ ψ̃_λ, where c_λ = ⟨f, ψ_λ⟩ and f is a cartoon-like function.
Reconstruction: (⟨f, e_n⟩ = Σ_{λ∈Λ} ⟨ψ̃_λ, e_n⟩ c_λ)_{n∈∆} → (c_λ)_{λ∈Λ}.   → Reconstruction algorithm?

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 13 / 32

slide-31
SLIDE 31

Dualizable Shearlets...

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 14 / 32

slide-32
SLIDE 32

Intuition: Partition of Fourier Domain, shear = 0

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 15 / 32


slide-34
SLIDE 34

Intuition: Filters

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 16 / 32

slide-35
SLIDE 35

Shearlet Generators

Let γ ∈ L²(R²) be compactly supported such that, for ρ > 0 fixed,
  |∂^d γ̂(ξ)| ≲ min{1, |ξ₁|^α} · (1 + |ξ₁|)^{−β} (1 + |ξ₂|)^{−β}
for all d ≤ R with R ≥ 1, α ≥ 1 + 6/ρ, and β > α + 1.

Observation: For each s,
  {γ^s_{j,m} = 2^{3j/4} γ(A_j S_s · − m) : j, m}  and  {γ̃^s_{j,m} = 2^{3j/4} γ̃(Ã_j S*_s · − m) : j, m}
form orthonormal bases for L²(R²).

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 17 / 32


slide-37
SLIDE 37

Dualizable Shearlet Frame

For some regularity parameter ρ > 0, define
  ψ_{j,k,m} = Θ_s ∗ γ^s_{j,m}  and  ψ̃_{j,k,m} = Θ̃_s ∗ γ̃^s_{j,m},  with s = 2^{−j/2} k.

Theorem (K, Lim; 2014): The dualizable shearlet system
  SH := {ψ_{j,k,m}, ψ̃_{j,k,m} : j ≥ 0, |k| < 2^{j/2}, m ∈ Z²}
forms a compactly supported frame, and a dual frame is given by
  { F^{−1}( (ψ_{j,k,m})∧ / Σ_s |(Θ_s)∧|² ),  F^{−1}( (ψ̃_{j,k,m})∧ / Σ_s |(Θ̃_s)∧|² ) : ψ_{j,k,m}, ψ̃_{j,k,m} ∈ SH }.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 18 / 32

slide-38
SLIDE 38

Optimal Sparse Approximation inherited!

Theorem (K, Lim; 2014): Let f be a cartoon-like function and let SH = (ψ_λ)_{λ∈Λ} be as before. Then, for any ρ > 0, there exists a positive constant C_ρ such that
  ‖f − f_N‖₂² ≤ C_ρ · N^{−2+15ρ} · (log N)²,
where f_N is the N-term approximation (of the N largest ⟨f, ψ_λ⟩'s) with respect to the dual frame of SH, i.e. f_N = Σ_{λ∈Λ_N} ⟨f, ψ_λ⟩ ψ̃_λ.

Recall: Optimal rate: N^{−2}. Regularity parameter: ρ > 0.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 19 / 32

slide-39
SLIDE 39

General Sampling Strategy

Fourier measurements: f → (⟨f, e_n⟩)_{n∈∆}.   → Sampling scheme?
Dualizable shearlet frame: {ψ_λ}_{λ∈Λ}.
Sparse representation: f = Σ_{λ∈Λ} c_λ ψ̃_λ, where c_λ = ⟨f, ψ_λ⟩ and f is a cartoon-like function.
Reconstruction: (⟨f, e_n⟩ = Σ_{λ∈Λ} ⟨ψ̃_λ, e_n⟩ c_λ)_{n∈∆} → (c_λ)_{λ∈Λ}.   → Reconstruction algorithm?

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 20 / 32

slide-40
SLIDE 40

Directional Sampling Strategy

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 21 / 32


slide-46
SLIDE 46

Sampling Strategy: Dualizable Shearlet Systems

Recall: We have (k ↔ s)
  ⟨f, ψ_{j,k,m}⟩ = ⟨f, Θ_s ∗ γ^s_{j,m}⟩ = ⟨Θ_s ∗ f, γ^s_{j,m}⟩ = c^s_{j,m}.

Determining the measurement vector:
  Θ_s ∗ f = Σ_{(j,m)∈Λ_s} c^s_{j,m} γ^s_{j,m}
  ⟹ ⟨Θ_s ∗ f, e_n⟩ = Σ_{(j,m)∈Λ_s} ⟨γ^s_{j,m}, e_n⟩ c^s_{j,m}
  ⟹ ⟨P^s_J(Θ_s ∗ f), e_n⟩ = Σ_{(j,m)∈Λ_{J,s}} ⟨γ^s_{j,m}, e_n⟩ c^s_{j,m}

Hence, we preliminarily set y_n := ⟨P^s_J(Θ_s ∗ f), e_n⟩.

Remark: In practice, P^s_J(Θ_s ∗ f) ≈ Θ_s ∗ f, hence y_n = Θ̂_s(n) · f̂(n).

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 22 / 32
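The remark that y_n = Θ̂_s(n)·f̂(n) is the convolution theorem. Its discrete (circular) analogue can be verified directly; the sizes and random arrays below are illustrative stand-ins for the filter Θ_s and the image f:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 16
f = rng.standard_normal((N, N))
theta = rng.standard_normal((N, N))   # stand-in for the filter Theta_s

def circ_conv(a, b):
    """Direct circular convolution on the N x N torus (brute force)."""
    out = np.zeros((N, N))
    for u in range(N):
        for v in range(N):
            out[u, v] = sum(a[j, k] * b[(u - j) % N, (v - k) % N]
                            for j in range(N) for k in range(N))
    return out

# Fourier samples of theta * f equal theta-hat times f-hat, pointwise:
lhs = np.fft.fft2(circ_conv(theta, f))
rhs = np.fft.fft2(theta) * np.fft.fft2(f)
print(np.allclose(lhs, rhs))   # True
```

So sampling the Fourier transform of the filtered image Θ_s ∗ f costs nothing beyond sampling f̂ itself: one multiplies the acquired samples by Θ̂_s(n).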


slide-48
SLIDE 48

General Sampling Scheme

Fourier measurements: f → (⟨f, e_n⟩)_{n∈∆}.   → Sampling scheme?
Dualizable shearlet frame: {ψ_λ}_{λ∈Λ}.
Sparse representation: f = Σ_{λ∈Λ} c_λ ψ̃_λ, where c_λ = ⟨f, ψ_λ⟩ and f is a cartoon-like function.
Reconstruction:
  (c_λ)_{λ∈Λ} = argmin_{(c̃_λ)_{λ∈Λ}} ‖(c̃_λ)_{λ∈Λ}‖₁  s.t.  (⟨f, e_n⟩ = Σ_{λ∈Λ} ⟨ψ̃_λ, e_n⟩ c̃_λ)_{n∈∆}.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 23 / 32
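The slide only specifies the ℓ¹ program, not a solver. As an illustrative stand-in, iterative soft thresholding (ISTA for the LASSO) recovers sparse coefficients from underdetermined measurements; the Gaussian matrix, the sizes, and all parameters below are assumptions for this sketch, not the lecture's setup:

```python
import numpy as np

# Recover a sparse coefficient vector c from y = Phi c via ISTA, a simple
# proximal-gradient solver whose limit approximates the l1-minimizer.
rng = np.random.default_rng(3)
N, M, s = 128, 60, 5
Phi = rng.standard_normal((M, N)) / np.sqrt(M)     # generic measurements
c_true = np.zeros(N)
c_true[rng.choice(N, s, replace=False)] = rng.choice([-1.0, 1.0], s) * (1 + rng.random(s))
y = Phi @ c_true

lam = 1e-3                                 # small l1 penalty
t = 1.0 / np.linalg.norm(Phi, 2) ** 2      # step size 1 / ||Phi||_2^2
c = np.zeros(N)
for _ in range(20000):
    g = c + t * (Phi.T @ (y - Phi @ c))    # gradient step on 1/2 ||y - Phi c||^2
    c = np.sign(g) * np.maximum(np.abs(g) - t * lam, 0.0)   # soft threshold
print(np.linalg.norm(c - c_true) / np.linalg.norm(c_true))  # small relative error
```

Any basis pursuit solver could be substituted here; ISTA is chosen only because it is a few lines of numpy.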


slide-50
SLIDE 50

Shear-Adapted Density Sampling

Linear System of Equations:
  ⟨P^s_J(Θ_s ∗ f), e_n⟩ = Σ_{(j,m)∈Λ_{J,s}} ⟨γ^s_{j,m}, e_n⟩ c^s_{j,m}.

Introducing Randomness:
  p_s(n_{s,ℓ})^{−1/2} ⟨P^s_J(Θ_s ∗ f), e_{n_{s,ℓ}}⟩ = Σ_{(j,m)∈Λ_{J,s}} p_s(n_{s,ℓ})^{−1/2} ⟨γ^s_{j,m}, e_{n_{s,ℓ}}⟩ c^s_{j,m}   (the weighted coefficient matrix on the right-hand side is Φ_s),

where
  s ∈ S_{J/2} := {0} ∪ {q/2^{j/2} : |q| < 2^{j/2}, q ∈ 2Z + 1, j = 0, …, J},
and {n_{s,ℓ} : ℓ = 1, …, L_s} ⊆ Z² ∩ [−2^{J(1+ρ)}, 2^{J(1+ρ)}]² is chosen according to a probability density function
  p_s(n) = c_s [J² (1 + |n₁|)(1 + |n₂ − s n₁|)]^{−1}.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 24 / 32
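The sampling density can be simulated directly. The sketch below draws frequencies with probability inversely proportional to (1 + |n₁|)(1 + |n₂ − s·n₁|), so low frequencies along the sheared direction are sampled most densely; the frequency range, the shear s, and the sample count are illustrative, and the normalization constant is computed explicitly rather than taken from the slide:

```python
import numpy as np

# Shear-adapted variable density sampling on a finite frequency grid.
rng = np.random.default_rng(0)
R, s, L = 32, 0.5, 200
n1, n2 = np.meshgrid(np.arange(-R, R + 1), np.arange(-R, R + 1), indexing="ij")
w = 1.0 / ((1 + np.abs(n1)) * (1 + np.abs(n2 - s * n1)))
p = w / w.sum()                        # proper probability density
idx = rng.choice(p.size, size=L, p=p.ravel())
samples = np.column_stack([n1.ravel()[idx], n2.ravel()[idx]])
print(samples.shape)                   # (200, 2) sampled frequencies
```

Plotting `samples` for several shears s reproduces the directional sampling pattern shown later in the numerical experiments.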


slide-55
SLIDE 55

Sparse Sampling Strategy

Theorem (K, Lim; 2015): Let f be a cartoon-like function which is C^{2,r}, r ∈ [1/4, 1), smooth apart from a C²-discontinuity curve of non-vanishing curvature. Further, let
  ρ > 0 be fixed (regularity),
  J > 0 be ‘sufficiently large’ (limiting scale),
  y_s := ( p_s(n_{s,ℓ})^{−1/2} ⟨P^s_J(Θ_s ∗ f), e_{n_{s,ℓ}}⟩ )_{ℓ=1,…,L_s}   (measurements),
  Φ_s := ( p_s(n_{s,ℓ})^{−1/2} ⟨γ^s_{j,m}, e_{n_{s,ℓ}}⟩ )_{(j,m)∈Λ_{J,s}, ℓ=1,…,L_s}   (sampling matrix).
For each s ∈ S_{J/2}, let (with Σ_{s∈S_{J/2}} L_s ≍ J · 2^{J/2·(1+2ρ)} =: N)
  (ĉ_λ)_{λ∈Λ_{J,s}} = argmin_c ‖c‖₁ subject to Φ_s c = y_s.
Then, with probability at least 1 − 2^{−J},
  ‖f − Σ_{s∈S_{J/2}} Σ_{λ∈Λ_{J,s}} ĉ_λ ψ̃_λ‖₂² ≲ 2^{−J(1−13ρ/2)}  (= O(N^{−2+Cρ}))
as J → ∞.  ⟹ Asymptotic Optimality!

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 25 / 32

slide-56
SLIDE 56

Numerical Experiments

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 26 / 32

slide-57
SLIDE 57

Sampling Schemes

Directional Sampling Scheme · Variable Density Sampling Scheme
Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 27 / 32

slide-58
SLIDE 58

Numerical Results for 512×512 MRI Image

Shearlet scheme (5% sampling rate, 32.2845 dB)
Wavelets + directional sampling (5% sampling rate, 29.8138 dB)
Original
Wavelets + variable density sampling (5% sampling rate, 24.9969 dB)
Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 28 / 32
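The dB figures on this slide are PSNR values. For 8-bit images the standard definition is PSNR = 10·log₁₀(255²/MSE); the helper and toy arrays below are illustrative, not the authors' code:

```python
import numpy as np

def psnr(original, reconstruction, peak=255.0):
    """Peak signal-to-noise ratio in dB for images with range [0, peak]."""
    diff = np.asarray(original, float) - np.asarray(reconstruction, float)
    mse = np.mean(diff**2)
    return 10.0 * np.log10(peak**2 / mse)

a = np.zeros((4, 4))
b = np.full((4, 4), 5.0)      # uniform error of 5 gray levels
print(round(psnr(a, b), 2))   # 34.15
```

So the ~2.5 dB gap between the shearlet scheme and wavelets with directional sampling corresponds to a substantially smaller mean squared reconstruction error.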

slide-59
SLIDE 59

Approximation Curves for 512×512 MRI Image

[Two plots: PSNR (dB) vs. sampling rate (%), and running time (sec) vs. sampling rate (%), each for shear08, shear16, shear, wave02, wave01.]

shear08: Directional sampling scheme with 8 directional filters. shear16: Directional sampling scheme with 16 directional filters. shear: Directional sampling scheme with (normal) shearlets. wave02: Directional sampling scheme with wavelets. wave01: Variable density sampling scheme with wavelets.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 29 / 32

slide-60
SLIDE 60

Let’s conclude...

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 30 / 32

slide-61
SLIDE 61

What to take Home...?

Sampling and reconstruction strategies are key for acquiring data in a continuous world. Applications such as MRI only allow Fourier samples.
Sampling and reconstruction scheme for Fourier data:

◮ Cartoon-like functions as continuum model.
◮ (Dualizable) shearlets as sparsifying system.
◮ Directional sampling scheme.

Asymptotically optimal recovery could be proven. Numerical evidence of the superiority of the scheme.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 31 / 32

slide-62
SLIDE 62

Technische Universität Berlin Applied Functional Analysis Group

THANK YOU!

References available at:

www.math.tu-berlin.de/∼kutyniok

Code available at:

www.ShearLab.org

Related Books:

  • Y. Eldar and G. Kutyniok

Compressed Sensing: Theory and Applications Cambridge University Press, 2012.

  • G. Kutyniok and D. Labate

Shearlets: Multiscale Analysis for Multivariate Data, Birkhäuser-Springer, 2012.

Gitta Kutyniok (TU Berlin) Lecture 3 Winter School 2015 32 / 32