SLIDE 1

Lecture 2: Applied Harmonic Analysis and Compressed Sensing

Gitta Kutyniok

(Technische Universität Berlin)

Winter School on “Compressed Sensing”, TU Berlin, December 3–5, 2015

Gitta Kutyniok (TU Berlin) Lecture 2 Winter School 2015 1 / 30

SLIDE 2

Challenge in Data Analysis

Recovery of missing data is a common problem in many applications. In imaging science, this is termed the inpainting problem. Examples include:
  • Removal of scratches on old photos.
  • Removal of overlaid text or graphics.
  • Missing sensors for measuring seismic data.
  • Filling in missing blocks in unreliably transmitted images.
  • ...

SLIDE 3

Inpainting, I

(Source: Elad, Starck, Querre, and Donoho; 2005)

SLIDE 4

Inpainting, II

(Source: Hennenfent and Herrmann; 2008)

SLIDE 5

Compressed Sensing Approach to Inpainting

Main Philosophy:
  • Original image is sparsified by a particular representation system.
  • Optimize the sparsity of the reconstructed image.
(Incomplete) List of Contributors: Donoho, Elad, Querre, and Starck (2005); Cai, Chan, Dong, Fadili, Gribonval, Hennenfent, Herrmann, Li, Setzer, Shen, Steidl, Ye, Xu, ...
Another Path, Variational Approaches:
  • Propagate information from the boundaries & guarantee smoothness.
  • Contributors: Bornemann, Chan, Esedoglu, Kang, Osher, Sapiro, Setzer, Shen, Steidl, Vese, ...

SLIDE 6

What is the Relation of Compressed Sensing and Inpainting?

SLIDE 9

‘Mathematical Model’

Model for missing data:
  • Given a signal x = x_K + x_M ∈ H_K ⊕ H_M.
  • Recover x, if only x_K is known.
Isn’t this impossible? x_M lives in an orthogonal space.
But we have additional information: x and x_K are morphologically different.

SLIDE 10

Underdetermined Problem!

Main Idea:
  • Let x ∈ H be a signal, Φ an ONB, and set x = Φc.
  • H = H_K ⊕ H_M with orthogonal projections P_K and P_M.
Task: Given y = P_K x ∈ H_K, solve y = P_K x̃ for x̃ ∈ H.
Problem: This is an underdetermined problem!

SLIDE 12

Compressed Sensing enters the Picture!

Solve y = Ax, A = (a_i)_i ∈ R^{n×m} with n ≪ m, if
  • x is sparse, i.e., ‖x‖_0 = #{i : x_i ≠ 0} is small.
  • A is incoherent, e.g., max_{i≠j} |⟨a_i, a_j⟩| is small.
Then x can be recovered from y by
  • Convex optimization: min_{x̃} ‖x̃‖_1 s.t. y = Ax̃.
  • Greedy algorithms: maximize |⟨y, a_i⟩|, etc.

Theorem (Bruckstein, Elad; 2002), (Donoho, Elad; 2003)
Let A ∈ R^{n×m} with normalized columns, n ≪ m, and let x satisfy
  ‖x‖_0 < (1/2)(1 + µ(A)^{−1}),
with coherence µ(A) = max_{i≠j} |⟨a_i, a_j⟩|. For y = Ax, we then have
  x = argmin_{x̃} ‖x̃‖_1 subject to y = Ax̃.
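Both the coherence µ(A) and the ℓ1 program are easy to try numerically. The following Python sketch (not from the lecture) computes µ(A) and solves min ‖x̃‖_1 s.t. Ax̃ = y as a linear program via the standard split x̃ = u − v with u, v ≥ 0. Note the coherence bound above is only sufficient; a random Gaussian A typically recovers sparser x than the bound certifies.

```python
import numpy as np
from scipy.optimize import linprog

def mutual_coherence(A):
    """mu(A) = max_{i != j} |<a_i, a_j>| after normalizing the columns of A."""
    A = A / np.linalg.norm(A, axis=0)
    G = np.abs(A.T @ A)          # Gram matrix of inner products
    np.fill_diagonal(G, 0.0)     # ignore the diagonal (i = j)
    return G.max()

def basis_pursuit(A, y):
    """min ||x||_1 s.t. A x = y, as an LP with the split x = u - v, u, v >= 0."""
    m = A.shape[1]
    res = linprog(np.ones(2 * m), A_eq=np.hstack([A, -A]), b_eq=y,
                  bounds=[(0, None)] * (2 * m))
    return res.x[:m] - res.x[m:]

rng = np.random.default_rng(0)
A = rng.standard_normal((20, 40))  # n = 20 << m = 40
x = np.zeros(40)
x[[3, 17]] = [1.5, -2.0]           # a 2-sparse signal
x_hat = basis_pursuit(A, A @ x)
print(np.linalg.norm(x_hat - x))   # recovery error (small if l1 succeeds)
```

The LP split is the textbook reduction of ℓ1 minimization to linear programming; greedy methods (matching pursuit and its variants) would instead iteratively pick the column maximizing |⟨residual, a_i⟩|.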

SLIDE 14

Birth of Inpainting via Compressed Sensing

Situation:
  • Let x ∈ H be a signal.
  • Let Φ be an ONB such that c with x = Φc is sparse.
  • H = H_K ⊕ H_M with orthogonal projections P_K and P_M.
ℓ1 Minimization Problem (Elad, Starck, Querre, Donoho; 2005):
  ĉ = argmin_{c̃} ‖c̃‖_1 s.t. P_K x = P_K Φ c̃,   x̂ = Φ ĉ.

Theorem (Donoho, Elad; 2003)
If ‖c‖_0 < (1/2)(1 + µ(P_K Φ)^{−1}) (with µ(A) = max_{i≠j} |⟨a_i, a_j⟩|), then x = Φ ĉ.
Morphological Difference: µ(P_K Φ) = µ([ Φ | P_K Φ ])
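The ℓ1 inpainting problem above can be sketched in one dimension, assuming (my choice, not the lecture's) a DCT basis as a stand-in for Φ; P_K simply keeps a random subset of samples:

```python
import numpy as np
from scipy.fft import idct
from scipy.optimize import linprog

# Phi = orthonormal inverse-DCT basis; P_K keeps a random subset of samples.
m = 64
Phi = idct(np.eye(m), axis=0, norm="ortho")    # columns = ONB atoms

c = np.zeros(m)
c[[2, 9, 30]] = [2.0, -1.0, 1.5]               # sparse coefficient vector
x = Phi @ c                                    # the full signal

rng = np.random.default_rng(1)
known = np.sort(rng.permutation(m)[:40])       # indices of the known part
A, y = Phi[known, :], x[known]                 # P_K Phi and P_K x

# c_hat = argmin ||c~||_1  s.t.  P_K Phi c~ = P_K x   (LP split c~ = u - v)
res = linprog(np.ones(2 * m), A_eq=np.hstack([A, -A]), b_eq=y,
              bounds=[(0, None)] * (2 * m))
c_hat = res.x[:m] - res.x[m:]
x_hat = Phi @ c_hat                            # inpainted signal
print(np.linalg.norm(x_hat - x))               # small if recovery succeeds
```

The only change from plain compressed sensing is that the "measurement matrix" is P_K Φ, the known rows of the basis, which is exactly why the coherence µ(P_K Φ) governs recovery.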

SLIDE 15

Let’s go back to Imaging Science...

SLIDE 16

Back to Imaging Science...

Main steps:
  • Original image governed by curvilinear structures.
  • Choose a suitable representation system which optimally sparsely approximates curvelike structures → Shearlets.
  • Minimize the ℓ1 norm of the coefficients.
  • The original image will be recovered as the sparsest possible continuation of the known part.
Question: What is a good model for an image?

SLIDE 18

Fitting Model for Images

Definition (Donoho; 2001): The set of cartoon-like images E²(R²) is defined by
  E²(R²) = {f ∈ L²(R²) : f = f_0 + f_1 · χ_B},
where B ⊂ [0,1]² with ∂B a closed C²-curve, and f_0, f_1 ∈ C_0²([0,1]²).

Theorem (Donoho; 2001): Let (ψ_λ)_λ ⊆ L²(R²). Allowing only polynomial depth search, the optimal asymptotic approximation error of f ∈ E²(R²) is
  ‖f − f_N‖_2^2 ≍ N^{−2},  N → ∞,
where f_N = Σ_{λ∈I_N} c_λ ψ_λ.
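A quick numerical sanity check of the Parseval bookkeeping behind such rates (toy coefficients of my choosing, not Donoho's construction): in an ONB, the best N-term error is the ℓ2 norm of the discarded coefficients, and coefficients decaying like n^{−3/2} give squared error ≍ N^{−2}.

```python
import numpy as np

def best_n_term_error(c, N):
    """||f - f_N||_2 in an ONB: keep the N largest |c_lambda|; by Parseval
    the error is the l2 norm of the discarded coefficients."""
    idx = np.argsort(np.abs(c))[::-1]   # indices sorted by decreasing magnitude
    return np.linalg.norm(c[idx[N:]])

# |c_n| ~ n^(-3/2)  =>  ||f - f_N||_2^2 ~ N^(-2), the cartoon-class rate.
n = np.arange(1, 100001)
c = n ** -1.5
e100 = best_n_term_error(c, 100) ** 2
e200 = best_n_term_error(c, 200) ** 2
print(e100 / e200)   # ~4: doubling N divides the squared error by ~2^2
```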

SLIDE 19

(Cone-adapted) Discrete Shearlet Systems

Definition (K, Labate; 2006): The (cone-adapted) discrete shearlet system SH(c; φ, ψ, ψ̃), c > 0, generated by φ ∈ L²(R²) and ψ, ψ̃ ∈ L²(R²) is the union of
  {φ(· − m) : m ∈ Z²},
  {2^{3j/4} ψ(S_k A_{2^j} · − m) : j ≥ 0, |k| ≤ ⌈2^{j/2}⌉, m ∈ Z²},
  {2^{3j/4} ψ̃(S̃_k Ã_{2^j} · − m) : j ≥ 0, |k| ≤ ⌈2^{j/2}⌉, m ∈ Z²}.

Theorem (K, Lim; 2011): Let φ, ψ, ψ̃ ∈ L²(R²) be compactly supported, and let ψ̂, ψ̃̂ satisfy certain decay conditions. Then SH(φ, ψ, ψ̃) provides an optimally sparse approximation of f ∈ E²(R²), i.e.,
  ‖f − f_N‖_2^2 ≤ C · N^{−2} · (log N)³,  N → ∞.
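The parabolic scaling matrix A_{2^j} = diag(2^j, 2^{j/2}) and shear matrix S_k = [[1, k], [0, 1]] appearing in the definition are concrete enough to write down; a small illustrative sketch:

```python
import numpy as np

def parabolic_scaling(j):
    """A_{2^j} = diag(2^j, 2^(j/2)): scale by 2^j in x1 and 2^(j/2) in x2."""
    return np.diag([2.0 ** j, 2.0 ** (j / 2)])

def shear(k):
    """S_k = [[1, k], [0, 1]]: changes orientation, preserves area (det = 1)."""
    return np.array([[1.0, float(k)], [0.0, 1.0]])

j = 2
M = shear(1) @ parabolic_scaling(j)   # S_k A_{2^j}, as in psi(S_k A_{2^j} x - m)
# det(S_k A_{2^j}) = 2^(3j/2), so the factor 2^(3j/4) = sqrt(det) is exactly
# the L2 normalization of the dilated atom.
print(np.linalg.det(M), 2.0 ** (3 * j / 2))
print(int(np.ceil(2 ** (j / 2))))     # admissible shear range: |k| <= ceil(2^(j/2))
```

The anisotropy (length ≈ width², hence "parabolic") is what lets shearlet atoms align with C²-curves, in contrast to isotropic wavelet scaling.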

SLIDE 20

Analysis of Shearlet-Based Inpainting...

SLIDE 21

Numerical Results of Inpainting, I

(Source: Lim; 2014)

SLIDE 22

Numerical Results of Inpainting, II

Undersampled seismic data | Reconstructed image

(Source: K, Lim; 2012)

SLIDE 23

Numerical Results for Inpainting, III

Original | Noisy version (80% missing) | Curvelets (29.95 dB, 182.15 s) | Shearlets (31.04 dB, 85.18 s)

(Source: Lim; 2012)

SLIDE 24

Microlocal Model

Masked Seismic Image:
Curvilinear Singularity:
  C = ∫_{−ρ}^{ρ} w(t) δ_{τ(t)} dt,
with τ : [−1, 1] → R² a C²-curve, ρ < 1, and w : [−ρ, ρ] → R_0^+ a ‘bump’ function.
Mask: M_h = {(x_1, x_2) ∈ R² : |x_1| ≤ h}, h > 0.
Observed Signal: f = 1_{R²\M_h} · C.
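A toy discretization of this model on [−1, 1]² (the parabola standing in for a generic C²-curve τ is my choice, not the lecture's): a thin curve plays the role of C, and the vertical strip M_h deletes a band of samples.

```python
import numpy as np

n, h = 256, 0.1
x1, x2 = np.meshgrid(np.linspace(-1, 1, n), np.linspace(-1, 1, n))

# Thin curvilinear singularity: pixels within ~one grid cell of x2 = 0.5 x1^2.
curve = (np.abs(x2 - 0.5 * x1 ** 2) < 2.0 / n).astype(float)
mask = np.abs(x1) <= h                         # M_h = {|x1| <= h}
observed = np.where(mask, 0.0, curve)          # f = 1_{R^2 \ M_h} . C

print(int(curve.sum()), int(observed.sum()))   # curve pixels vs. observed ones
```

Inpainting then asks to restore the pixels of `curve` hidden inside the strip, using only `observed`.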

SLIDE 25

Scale-Dependent Decomposition

Observed Object: f = 1_{R²\M_h} · C.
Subband Decomposition: C → C_j = C ⋆ F_j.
Question: Shall we consider 1_{R²\M_h} · C_j?

SLIDE 27

Size of the Mask

Intuition: Inpainting at scale j seems possible provided h = o(2^{−j/2}) as j → ∞.
Asymptotic Analysis: Consider h = h_j and set f_j = 1_{R²\M_{h_j}} · C_j.
Shearlet-Inpainting:
  S_j = argmin_{S̃_j} ‖(⟨S̃_j, σ_η⟩)_η‖_1  s.t.  f_j = 1_{R²\M_{h_j}} · S̃_j

SLIDE 28

Asymptotic Inpainting

Theorem (King, K, Zhuang; 2014)
For h_j = o(2^{−j/2}) as j → ∞,
  ‖S_j − C_j‖_2 / ‖C_j‖_2 → 0,  j → ∞.

SLIDE 31

Analysis of Inpainting within one Scale

Signal Model: x = P_K x + P_M x ∈ H_K ⊕ H_M, where P_K (P_M) is the orthogonal projection onto H_K (H_M).
Remarks:
  • Let Φ be a tight frame (ΦΦ^T = Id) for H such that x = Φc with c sparse.
  • x = Φ(Φ^T x) with Φ^T x sparse.
  • The norm may be placed on the analysis side instead of the synthesis side.
Inpainting Technique:
  • Synthesis side: c⋆ = argmin_{c̃} ‖c̃‖_1 subject to P_K Φ c̃ = P_K x.
  • Analysis side: x⋆ = argmin_{x̃} ‖Φ^T x̃‖_1 subject to P_K x̃ = P_K x.

SLIDE 32

Central Estimate

Definition: Let Φ = (φ_i)_{i∈I}. x_0 is δ-relatively sparse in Φ with respect to Λ ⊆ I if ‖1_{Λ^c} Φ^T x_0‖_1 ≤ δ. We call Λ a set of significant coefficients. We define the cluster coherence for Λ and H_M by
  µ_c = µ_c(Φ, Λ, H_M) = max_{j∈I} Σ_{i∈Λ} |⟨P_M φ_i, φ_j⟩|.

Theorem (King, K, Zhuang; 2014)
If x is δ-relatively sparse in Φ with respect to Λ, then
  ‖x⋆ − x‖_2 ≤ 2δ / (1 − 2µ_c).
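Both quantities in this estimate are directly computable for finite frames; a minimal sketch with a deliberately simple ONB (a toy example of mine, not from the lecture):

```python
import numpy as np

def cluster_coherence(Phi, Lam, P_M):
    """mu_c = max_j sum_{i in Lam} |<P_M phi_i, phi_j>| (columns phi_i of Phi)."""
    G = np.abs((P_M @ Phi[:, Lam]).T @ Phi)   # rows: i in Lam, columns: j in I
    return G.sum(axis=0).max()

def error_bound(delta, mu_c):
    """Abstract estimate ||x* - x||_2 <= 2 delta / (1 - 2 mu_c)."""
    assert mu_c < 0.5, "estimate is vacuous unless mu_c < 1/2"
    return 2.0 * delta / (1.0 - 2.0 * mu_c)

# Phi = standard ONB of R^4; the mask hides the last coordinate (H_M).
Phi = np.eye(4)
P_M = np.diag([0.0, 0.0, 0.0, 1.0])

mu_good = cluster_coherence(Phi, [0, 1], P_M)  # significant atoms avoid the mask
mu_bad = cluster_coherence(Phi, [3], P_M)      # significant atom sits in the mask
print(mu_good, error_bound(0.1, mu_good))      # 0.0 0.2
print(mu_bad)                                  # 1.0: the estimate gives nothing
```

The point of the toy example: when the significant atoms barely interact with the masked subspace, µ_c is small and the bound is informative; an atom concentrated in the gap drives µ_c past 1/2 and the estimate collapses, mirroring the scale condition on h_j.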

SLIDE 33

Application of Previous Result

x: Filtered signal C_j.
H = H_K ⊕ H_M: Space L²(R²) = L²(R²\M_{h_j}) ⊕ L²(M_{h_j}).
Φ: Shearlets filtered with F_j.
Λ: Significant shearlet coefficients of C_j.
δ: Degree of approximation by significant coefficients.
µ_c: Cluster coherence between masked shearlets.
Estimate of error: 2δ / (1 − 2µ_c).

SLIDE 36

Asymptotic Inpainting

Application of the abstract estimate for inpainting then implies:

Theorem (King, K, Zhuang; 2014)
For h_j = o(2^{−j/2}) as j → ∞,
  ‖S_j − C_j‖_2 / ‖C_j‖_2 → 0,  j → ∞.

Extension to universal shearlets: (Genzel, K; 2015).

SLIDE 37

What about Wavelets?

SLIDE 38

Empirical Results using Wavelets and Shearlets

Wavelet Inpainting | Shearlet Inpainting

SLIDE 40

Shearlets versus Wavelets

Theorem (King, K, Zhuang; 2014)
For h_j = o(2^{−j}) as j → ∞,
  ‖W_j − C_j‖_2 / ‖C_j‖_2 → 0,  j → ∞.

Theorem (King, K, Zhuang; 2014)
Using the inpainting strategy
  1. band-pass filtering,
  2. hard thresholding with a scale-dependent threshold,
for h_j = ω(2^{−j}) as j → ∞, we obtain
  ‖W_j − C_j‖_2 / ‖C_j‖_2 ↛ 0,  j → ∞.
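Step 2 of this strategy is simple to state in code. The sketch below is illustrative: the schedule t_j = t_0 · 2^{−j/2} is a hypothetical choice of mine, since the slide only says the threshold depends on the scale j.

```python
import numpy as np

def hard_threshold(coeffs, t):
    """Keep coefficients with |c| >= t, zero out the rest (hard thresholding)."""
    return np.where(np.abs(coeffs) >= t, coeffs, 0.0)

def scale_dependent_thresholds(t0, num_scales):
    """Hypothetical schedule t_j = t0 * 2^(-j/2), one threshold per scale j."""
    return [t0 * 2.0 ** (-j / 2) for j in range(num_scales)]

c = np.array([0.05, -0.8, 0.3, -0.02, 1.1])
print(hard_threshold(c, 0.25))           # the two small entries are zeroed
print(scale_dependent_thresholds(1.0, 3))
```

In the full strategy one would band-pass filter the observed signal into the C_j, hard-threshold the coefficients at each scale with t_j, and synthesize; the theorems above say this succeeds or fails depending on how the gap width h_j scales against 2^{−j}.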

SLIDE 41

What to take Home...?

  • Inpainting can be achieved by sparse approximation and ℓ1-minimization.
  • Shearlets are the optimal choice if the original image is governed by anisotropic features.
  • Key features of our asymptotic optimality result:
      ◮ Continuum model.
      ◮ Geometrically clustered sparsity and cluster coherence.
      ◮ Microlocal analysis viewpoint.