SLIDE 1

Fractional-Tikhonov regularization on graphs

(applied to signal and image restoration)

Davide Bianchi
Università degli Studi dell'Insubria, Dip. di Scienze e Alta Tecnologia

23rd of May, 2018

SLIDE 2

Our model problem

  yδ = K ∗ x + noise

  • K represents the blur and is severely ill-conditioned (a compact integral operator of the first kind);
  • yδ is the known measured data (the blurred and noisy image);
  • ‖noise‖ ≤ δ.
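As an illustrative sketch only (not the talk's actual setup): below, a hypothetical row-normalized Gaussian blur matrix stands in for K, and the grid size, blur width, and 2% noise level are assumptions chosen to mirror the later 1d examples.

```python
import numpy as np

n = 100
t = np.linspace(0, 1, n)

# Hypothetical Gaussian blur matrix standing in for the compact operator K
# (row-normalized so it acts like an averaging kernel).
K = np.exp(-(t[:, None] - t[None, :]) ** 2 / (2 * 0.05 ** 2))
K /= K.sum(axis=1, keepdims=True)

# Step function, as used for the true solution in the later 1d examples.
x_true = (t > 0.5).astype(float)

rng = np.random.default_rng(0)
y = K @ x_true
noise = rng.standard_normal(n)
noise *= 0.02 * np.linalg.norm(y) / np.linalg.norm(noise)  # 2% relative noise
y_delta = y + noise
delta = np.linalg.norm(noise)  # the noise bound: ||noise|| <= delta
```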

SLIDE 3

Singular value expansion and generalized inverse

Since K is compact, we can write

  Kx = Σ_{m=1}^{+∞} σ_m ⟨x, v_m⟩ u_m,

where (σ_m; v_m, u_m)_{m∈N} is the singular value expansion of K.

Generalized inverse

We define K† : D(K†) ⊆ Y → X as

  K† y = Σ_{m: σ_m > 0} σ_m⁻¹ ⟨y, u_m⟩ v_m,

  D(K†) = { y ∈ Y : Σ_{m: σ_m > 0} σ_m⁻² |⟨y, u_m⟩|² < ∞ }.
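In the discrete setting the singular value expansion becomes the SVD, and K† can be sketched as follows; the helper name pinv_solve and the truncation tolerance are our assumptions, not from the talk.

```python
import numpy as np

def pinv_solve(K, y, tol=1e-12):
    # K† y = sum over sigma_m > 0 of sigma_m^{-1} <y, u_m> v_m;
    # singular values below tol * sigma_1 are treated as zero.
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    keep = s > tol * s[0]
    return Vt[keep].T @ ((U[:, keep].T @ y) / s[keep])

# On a small well-conditioned system, K† recovers the exact solution.
rng = np.random.default_rng(1)
K = rng.standard_normal((8, 5))
x = rng.standard_normal(5)
print(np.allclose(pinv_solve(K, K @ x), x))  # True
```

For an ill-conditioned K the same formula amplifies the noise components by σ_m⁻¹, which is exactly why regularization is needed.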

SLIDE 4

In the noise-free case we have x† = K†y, but due to the ill-posedness of the problem, xδ = K†yδ is not a good approximation of x†. Since we are dealing with data affected by noise, i.e., with yδ, we cannot use K† to compute an approximate solution: we have to regularize the operator K†.

SLIDE 5

Filter-based regularization methods

We substitute the operator K† with a one-parameter family of continuous linear operators {R_α}_{α∈(0,α₀)}:

  K† yδ = Σ_{m: σ_m > 0} σ_m⁻¹ ⟨yδ, u_m⟩ v_m
    ⇓
  R_α yδ = Σ_{m: σ_m > 0} F_α(σ_m) σ_m⁻¹ ⟨yδ, u_m⟩ v_m.

α = α(δ, yδ) is called the parameter choice rule.
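In matrix form, R_α amounts to filtering the SVD coefficients; the helper name filtered_solve is ours. With the standard Tikhonov filter, the filtered solution coincides with solving the regularized normal equations, which the check below confirms.

```python
import numpy as np

def filtered_solve(K, y_delta, filt):
    # R_alpha y = sum F_alpha(sigma_m) sigma_m^{-1} <y, u_m> v_m
    U, s, Vt = np.linalg.svd(K, full_matrices=False)
    return Vt.T @ (filt(s) * (U.T @ y_delta) / s)

rng = np.random.default_rng(2)
n = 20
K = rng.standard_normal((n, n))
y = rng.standard_normal(n)
alpha = 0.1

x_filtered = filtered_solve(K, y, lambda s: s**2 / (s**2 + alpha))
# Equivalent to the Tikhonov normal equations (K^T K + alpha I) x = K^T y:
x_normal = np.linalg.solve(K.T @ K + alpha * np.eye(n), K.T @ y)
print(np.allclose(x_filtered, x_normal))  # True
```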

SLIDE 6

Fractional Tikhonov filter functions

  • Standard Tikhonov filter: F_α(σ_m) = σ_m² / (σ_m² + α), with α > 0.
  • Fractional Tikhonov filter: F_{α,γ}(σ_m) = ( σ_m² / (σ_m² + α) )^γ, with α > 0 and γ ∈ [1/2, ∞) (Klann and Ramlau, 2008).
  • Weighted/Fractional Tikhonov filter: F_{α,r}(σ_m) = σ_m^{r+1} / (σ_m^{r+1} + α), with α > 0 and r ∈ [0, +∞) (Hochstenbach and Reichel, 2011).

For 1/2 ≤ γ < 1 and 0 ≤ r < 1, the fractional and weighted filters smooth the reconstructed solution less than standard Tikhonov.
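The three filters are one-liners, and the under-smoothing claim can be checked directly: for γ < 1 and r < 1 the fractional filters damp small singular values less than standard Tikhonov (the sample σ range and α below are illustrative choices).

```python
import numpy as np

def tikhonov(s, a):            # standard: s^2 / (s^2 + a)
    return s**2 / (s**2 + a)

def fractional(s, a, gamma):   # Klann-Ramlau: (s^2 / (s^2 + a))^gamma
    return (s**2 / (s**2 + a))**gamma

def weighted(s, a, r):         # Hochstenbach-Reichel: s^(r+1) / (s^(r+1) + a)
    return s**(r + 1) / (s**(r + 1) + a)

s = np.logspace(-4, 0, 50)     # sample singular values in (0, 1]
a = 1e-3

# Both fractional variants keep more of each component than standard Tikhonov.
print(np.all(fractional(s, a, 0.5) >= tikhonov(s, a)))  # True
print(np.all(weighted(s, a, 0.5) >= tikhonov(s, a)))    # True
```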

SLIDE 7

An easy 1d example of oversmoothing - part 1

Blur taken from Heat(n, κ) in Regtools, with n = 100, κ = 1 and 2% noise. True solution:

  x† : [0, 1] → R,  x†(t) = 0 if 0 ≤ t ≤ 0.5,  1 if 0.5 < t ≤ 1.

[Figure: true solution vs. Tik, F-Tik (r = 0.35), and F-Tik (r = 3) reconstructions of the blurred signal with 2% noise.]

SLIDES 8-11

Let's reformulate the problem

  • Tikhonov: argmin_{x∈Rⁿ} ‖Kx − y‖₂² + α‖x‖₂²
  • F. Tikhonov: argmin_{x∈Rⁿ} ‖Kx − y‖_W² + α‖x‖₂², with W = (KK*)^{(r−1)/2}.
  • Generalized Tikhonov: argmin_{x∈Rⁿ} ‖Kx − y‖₂² + α‖Lx‖₂², with L semi-positive definite and ker(L) ∩ ker(K) = {0}. ker(L) should 'approximate the features' of x†.
  • Generalized F. Tikhonov: argmin_{x∈Rⁿ} ‖Kx − y‖_W² + α‖Lx‖₂²
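A minimal sketch of the last (most general) formulation, assuming a square K as in a discretized deconvolution problem: the minimizer of ‖Kx − y‖_W² + α‖Lx‖₂² solves the normal equations (Kᵀ W K + α Lᵀ L) x = Kᵀ W y. The function name gen_frac_tikhonov is ours.

```python
import numpy as np

def gen_frac_tikhonov(K, y, alpha, r, L):
    # Minimizer of ||Kx - y||_W^2 + alpha ||Lx||_2^2, W = (K K^*)^((r-1)/2),
    # via the normal equations (K^T W K + alpha L^T L) x = K^T W y.
    # Assumes K square, as for a discretized deconvolution problem.
    U, s, _ = np.linalg.svd(K)          # K K^T = U diag(s^2) U^T
    W = U @ np.diag(s ** (r - 1)) @ U.T
    A = K.T @ W @ K + alpha * (L.T @ L)
    return np.linalg.solve(A, K.T @ W @ y)

# With r = 1 (so W = I) and L = I this reduces to standard Tikhonov:
rng = np.random.default_rng(3)
n = 15
K = rng.standard_normal((n, n))
y = rng.standard_normal(n)
x1 = gen_frac_tikhonov(K, y, 0.1, 1.0, np.eye(n))
x2 = np.linalg.solve(K.T @ K + 0.1 * np.eye(n), K.T @ y)
print(np.allclose(x1, x2))  # True
```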

SLIDE 12

Laplacian - Finite Difference approximation

Poisson (Sturm-Liouville) problem on [0, 1]:

  −∆x(t) = f(t),  t ∈ (0, 1),
  α₁ x(0) + β₁ x′(0) = γ₁,
  α₂ x(1) + β₂ x′(1) = γ₂.

If we consider homogeneous Dirichlet boundary conditions (x(0) = x(1) = 0) and the 3-point stencil FD approximation

  −∆x(t) ≈ ( −x(t − h) + 2x(t) − x(t + h) ) / h²,  h² = n⁻²,

then

  L = [  2 −1          ]
      [ −1  2 −1       ]
      [     ⋱  ⋱  ⋱    ]
      [        −1  2   ]

with ker(L) = {0}.

SLIDE 13

Laplacian - Finite Difference approximation

· · · If we consider homogeneous Neumann boundary conditions (x′(0) = x′(1) = 0) and the same 3-point stencil FD approximation

  −∆x(t) ≈ ( −x(t − h) + 2x(t) − x(t + h) ) / h²,  h² = n⁻²,

then

  L = [  1 −1          ]
      [ −1  2 −1       ]
      [     ⋱  ⋱  ⋱    ]
      [        −1  1   ]

with ker(L) = Span{1}.
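The two boundary-condition variants can be checked numerically. The helper name fd_laplacian is ours, and the matrices are the unscaled 3-point stencils from the two slides above: Dirichlet gives a trivial kernel, Neumann a kernel spanned by the constant vector.

```python
import numpy as np

def fd_laplacian(n, bc="dirichlet"):
    # Unscaled 3-point stencil for -x'': tridiag(-1, 2, -1).
    # Neumann boundary conditions turn the two corner entries into 1.
    L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    if bc == "neumann":
        L[0, 0] = L[-1, -1] = 1
    return L

n = 50
L_dir = fd_laplacian(n, "dirichlet")
L_neu = fd_laplacian(n, "neumann")

print(np.linalg.matrix_rank(L_dir))        # 50: ker(L) = {0}
print(np.linalg.matrix_rank(L_neu))        # 49: one-dimensional kernel
print(np.allclose(L_neu @ np.ones(n), 0))  # True: the kernel is Span{1}
```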

SLIDE 14

An easy 1d example of oversmoothing - part 2

Blur taken from Heat(n, κ) in Regtools, with n = 100, κ = 1 and 2% noise. True solution:

  x† : [0, 1] → R,  x†(t) = 0 if 0 ≤ t ≤ 0.5,  1 if 0.5 < t ≤ 1.

[Figure: true solution vs. Tik + L Dirichlet and Tik + L Neumann reconstructions.]

SLIDE 15

Graph Laplacian

An image/signal x can be represented by a weighted undirected graph G = (V, E, w):

  • the nodes vi ∈ V are the pixels of the image/signal, and xi ≥ 0 is the color intensity of x at vi;
  • an edge e_{i,j} ∈ E ⊆ V × V exists if the pixels vi and vj are connected, i.e., vi ∼ vj;
  • w : E → R is a (positive) similarity weight function, w(e_{i,j}) = w_{i,j}.

The graph Laplacian is defined as

  (∆_w^{(n)} x)_i = Σ_{v_j ∼ v_i} w_{i,j} (x_i − x_j).

Remark

  ∫_{([0,1],µ)} x″(t) φ(t) dµ(t) = ∫_{([0,1],µ)} x(t) φ″(t) dµ(t)

SLIDE 16

Graph Laplacian - Example

Example. In the 1d case, if we define

  vi ∼ vj iff i = j + 1 or i = j − 1,
  w_{i,j} = 1 if vi ∼ vj, 0 otherwise,

then it holds

  ∆_w^{(n)} = L_w^{(n)} = [  1 −1          ]
                          [ −1  2 −1       ]
                          [     ⋱  ⋱  ⋱    ]
                          [        −1  1   ]
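This identity is easy to verify: building the graph Laplacian as L = D − W (degree matrix minus weighted adjacency, which matches the definition above) on the unit-weight path graph reproduces the Neumann finite-difference matrix. The helper name graph_laplacian is ours.

```python
import numpy as np

def graph_laplacian(W):
    # (Delta_w x)_i = sum_{j ~ i} w_ij (x_i - x_j), i.e. L = D - W,
    # with D = diag(row sums of W) the degree matrix.
    return np.diag(W.sum(axis=1)) - W

# Path graph: v_i ~ v_j iff |i - j| = 1, unit weights.
n = 6
W = np.eye(n, k=1) + np.eye(n, k=-1)
L = graph_laplacian(W)

# It coincides with the Neumann finite-difference Laplacian of slide 13.
L_neu = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L_neu[0, 0] = L_neu[-1, -1] = 1
print(np.allclose(L, L_neu))  # True
```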

SLIDE 17

Question

Why should the red points be connected?

[Figure: the reconstructed 1d step signal, with the two red points on either side of the jump highlighted.]

SLIDE 18

Answer

They should not, indeed:

[Figure: true solution vs. Tik + graph reconstruction.]

  L_w^{(n)} = diag( L_w^{(n/2)}, L_w^{(n/2)} ),

i.e., a block-diagonal Laplacian: the graph splits into two disconnected components, one on each side of the jump.
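The block-diagonal L_w corresponds to a graph with two disconnected path components. A small sketch (helper names ours) checks that the kernel then picks up one constant vector per component, so ker(L_w) is two-dimensional.

```python
import numpy as np

def path_laplacian(m):
    # Graph Laplacian of the unit-weight path graph on m nodes.
    L = 2 * np.eye(m) - np.eye(m, k=1) - np.eye(m, k=-1)
    L[0, 0] = L[-1, -1] = 1
    return L

def block_diag(*blocks):
    # Assemble a block-diagonal matrix (graph with disconnected components).
    n = sum(b.shape[0] for b in blocks)
    out = np.zeros((n, n))
    i = 0
    for b in blocks:
        m = b.shape[0]
        out[i:i + m, i:i + m] = b
        i += m
    return out

# Cutting the edge across the jump leaves two disconnected components;
# the kernel dimension equals the number of components.
n = 100
L = block_diag(path_laplacian(n // 2), path_laplacian(n // 2))
print(n - np.linalg.matrix_rank(L))  # 2: one constant vector per component
```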
SLIDE 19

Fractional Tikhonov + Graph Laplacian

[Figure: true solution vs. F. Tik. + graph reconstruction with r = 4.]

SLIDE 20

Remark 1/2

[Figure: true solution vs. Tik. + graph reconstruction, 1 p. con.]

SLIDE 21

Remark 2/2

[Figure: true solution vs. Tik. + graph reconstruction, 5 p. con.]

SLIDE 22

Another example - heat(n, 1), 5% noise

  L_w^{(n)} = diag( L_w^{(n/4)}, L_w^{(n/4)}, L_w^{(n/4)}, L_w^{(n/4)} )
SLIDE 23

Another example - deriv2(n, 3), 2% noise

[Figure: true solution vs. Tik. and F. Tik. + graph reconstructions.]

  L_w^{(n)} = diag( L_w^{(n/2)}, L_w^{(n/2)} ),

  L_w^{(n/2)} = [  0  0  ⋯  0   ]
                [ −1  2 −1      ]
                [     ⋱  ⋱  ⋱   ]
                [     −1  2 −1  ]
                [  0  ⋯  0  0   ]

  ker(L_w^{(n/2)}) = Span{1, t}.
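Assuming the interior-rows-only reading of L_w^{(n/2)} above (zero first and last rows, so no boundary conditions are imposed), its kernel is exactly the linear functions, which a quick check confirms.

```python
import numpy as np

# Second-difference matrix with zero first and last rows: interior rows
# enforce -x_{i-1} + 2 x_i - x_{i+1} = 0, whose solutions are the linear
# functions, so the kernel is Span{1, t}.
n = 50
L = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
L[0, :] = 0
L[-1, :] = 0

t = np.linspace(0, 1, n)
print(np.allclose(L @ np.ones(n), 0))  # True: constants are in the kernel
print(np.allclose(L @ t, 0))           # True: linear functions too
print(n - np.linalg.matrix_rank(L))    # 2: the kernel is exactly Span{1, t}
```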

SLIDE 24

(another)³ example - heat(n, 1), 2% noise

[Figure: true solution vs. Tik. and F. Tik. + graph reconstructions.]

SLIDE 25

Some references

  • Shuman, D. I., Narang, S. K., Frossard, P., Ortega, A., and Vandergheynst, P., The emerging field of signal processing on graphs: Extending high-dimensional data analysis to networks and other irregular domains, IEEE Signal Processing Magazine, 30(3), 83-98 (2013).
  • Faber, X. W. C., Spectral convergence of the discrete Laplacian on models of a metrized graph, New York J. Math., 12, 97-121 (2006).
  • Bianchi, D., and Donatelli, M., On generalized iterated Tikhonov regularization with operator-dependent seminorms, Electronic Transactions on Numerical Analysis, 47, 73-99 (2017).
  • Gerth, D., Klann, E., Ramlau, R., and Reichel, L., On fractional Tikhonov regularization, Journal of Inverse and Ill-posed Problems, 23(6), 611-625 (2015).
  • Bianchi, D., and Donatelli, M., Fractional-Tikhonov regularization on graphs for image restoration, preprint.