Kernel-based Image Reconstruction from Scattered Radon Data (PowerPoint presentation)



slide-1
SLIDE 1

Kernel-based Image Reconstruction from scattered Radon data by (anisotropic) positive definite functions

Stefano De Marchi¹, February 5, 2016

¹ Joint work with A. Iske (Hamburg, D), A. Sironi (Lausanne, CH) and G. Santin (Stuttgart, D)

slide-2
SLIDE 2

Main references

1. S. De Marchi, A. Iske and A. Sironi, Kernel-based Image Reconstruction from Scattered Radon Data by Positive Definite Functions, submitted 2013 (download at http://www.math.unipd.it/~demarchi/papers/Kernel based image reconstruction.pdf)

2. S. De Marchi, A. Iske and G. Santin, Kernel-based Image Reconstruction from Scattered Radon Data by Anisotropic Positive Definite Functions, draft 2016

3. T. G. Feeman, The Mathematics of Medical Imaging: A Beginner's Guide, Springer, 2010.

4. A. Iske, Reconstruction of Functions from Generalized Hermite-Birkhoff Data. In: Approximation Theory VIII, Vol. 1: Approximation and Interpolation, C. K. Chui and L. L. Schumaker (eds.), World Scientific, Singapore, 1995, 257-264.

5. F. Natterer, The Mathematics of Computerized Tomography. Classics in Applied Mathematics, vol. 32, SIAM, Philadelphia, 2001.

6. A. Sironi, Medical Image Reconstruction Using Kernel-Based Methods, Master's thesis, University of Padua, 2011, arXiv:1111.5844v1.

7. D. Poggiali, Reconstruction of Medical Images from Radon Data in Transmission and Emission Tomography, Master's thesis, University of Padua, 2012.

8. M. A. Narduzzo, A Kernel Method for CT Reconstruction: A Fast Implementation Using Circulant Matrices, Master's thesis, University of Padua, Dec. 2013.

9. S. Guglielmo, Stable Kernel-Based Methods for Medical Image Reconstruction, Master's thesis, University of Padua, Dec. 2013.

2 of 69

slide-3
SLIDE 3

Part I The problem and the first approach

Work with A. Iske, A. Sironi

3 of 69

slide-4
SLIDE 4

Outline

1. Image Reconstruction from CT

2. Radon transform

3. Algebraic Reconstruction Techniques (ART); Kernel approach; Regularization; Numerical results

4 of 69

slide-5
SLIDE 5

Description of CT

How does it work?

Non-invasive medical procedure (X-ray equipment). The X-ray beam is assumed to be:

  • monochromatic;
  • of zero width;
  • not subject to diffraction or refraction.

Produces cross-sectional images. Transmission tomography (emission tomography, such as PET and SPECT, is not considered here)

5 of 69

slide-6
SLIDE 6

Description of CT

How does it work?

ℓ(t,θ) −→ line along which the X-ray is moving; (t, θ) ∈ R × [0, π) −→ polar coordinates of line-points;

f −→ attenuation coefficient of the body; I −→ intensity of the X-ray.

6 of 69

slide-7
SLIDE 7

X-rays

Discovered by Wilhelm Conrad Röntgen in 1895.

Wavelength in the range [0.01, 10] × 10⁻⁹ m. Attenuation coefficient: A(x) ≈ "number of photons absorbed per mm", A : Ω → [0, ∞).

Figure: First X-ray image: Frau Röntgen's left hand.

7 of 69

slide-8
SLIDE 8

CT machine and people

Computerized Tomography (CT)

modern CT; Allan MacLeod Cormack; Godfrey Newbold Hounsfield

Both received the Nobel Prize in Physiology or Medicine in 1979.

8 of 69

slide-9
SLIDE 9

Computerized Axial Tomography

Figure: First generation of CT scanner design (A. Cormack and G. Hounsfield, 1970).

Reconstruction from X-ray measurements taken from 160 or more beams at each of 180 directions.

Beer's law (loss of intensity):

∫_{x0}^{x1} A(x) dx = ln(I0/I1)

with the right-hand side given by the CT measurement.

9 of 69

slide-10
SLIDE 10

Outline

1. Image Reconstruction from CT

2. Radon transform

3. Algebraic Reconstruction Techniques (ART); Kernel approach; Regularization; Numerical results

10 of 69

slide-11
SLIDE 11

Lines in the plane

A line ℓ in the plane, perpendicular to the unit vector nθ = (cos θ, sin θ) and passing through the point p = (t cos θ, t sin θ) = t nθ, can be characterized by the polar coordinates t ∈ R, θ ∈ [0, π), i.e. ℓ = ℓ(t,θ):

ℓ(t,θ) = {x := (t cos θ − s sin θ, t sin θ + s cos θ) = (x1(s), x2(s)), s ∈ R}

Figure: A line in the plane.

11 of 69

slide-12
SLIDE 12

Radon transform

Definition

The Radon transform of a given function f : Ω ⊂ R² → R is defined, for each pair of real numbers (t, θ), as the line integral

Rf(t, θ) = ∫_{ℓ(t,θ)} f(x) dx = ∫_R f(x1(s), x2(s)) ds

Figure: Left: image. Right: its Radon transform (sinogram)
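The line integral above can be checked numerically. A minimal sketch (names and parameter values are illustrative, not from the slides), using the parametrization x(s) = (t cos θ − s sin θ, t sin θ + s cos θ):

```python
import numpy as np

def radon_line(f, t, theta, s_max=2.0, n=4001):
    # Approximate Rf(t, theta) = integral over s of f(x1(s), x2(s))
    # by the trapezoidal rule on a uniform grid, truncated to [-s_max, s_max].
    s = np.linspace(-s_max, s_max, n)
    x1 = t * np.cos(theta) - s * np.sin(theta)
    x2 = t * np.sin(theta) + s * np.cos(theta)
    vals = f(x1, x2)
    h = s[1] - s[0]
    return h * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

# Indicator of the unit disc: the exact transform is the chord length,
# Rf(t, theta) = 2*sqrt(1 - t^2) for |t| < 1 and 0 otherwise, for every theta.
disc = lambda x1, x2: (x1**2 + x2**2 <= 1.0).astype(float)
```

For the unit disc, `radon_line(disc, t, theta)` is close to 2√(1 − t²) for any angle θ, reflecting the rotational symmetry of the phantom.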

12 of 69

slide-13
SLIDE 13

Radon transform

Image reconstruction

A CT scan measures the X-ray projections through the object, producing a sinogram, which is effectively the Radon transform of the attenuation coefficient function f in the (t, θ)-plane.

13 of 69

slide-14
SLIDE 14

Radon transform: another example

Figure: Shepp-Logan phantom. Figure: Radon transform (sinogram).

14 of 69

slide-15
SLIDE 15

Back projection

First attempt to recover f from Rf. The back projection of a function h ≡ h(t, θ) is the transform

Bh(x) = (1/π) ∫₀^π h(x1 cos θ + x2 sin θ, θ) dθ

i.e. the average of h over the angular variable θ, where t = x1 cos θ + x2 sin θ = xᵀnθ.

Figure: Back projection of the Radon transform.
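The angular average can be sketched with a uniform grid of angles (an illustrative discretization, not the slides' code):

```python
import numpy as np

def back_projection(h, x1, x2, n_theta=360):
    # Bh(x) = (1/pi) * integral over theta in [0, pi) of
    # h(x1*cos(theta) + x2*sin(theta), theta), approximated as the
    # mean of h over a uniform angular grid.
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    t = x1 * np.cos(thetas) + x2 * np.sin(thetas)
    return np.mean(h(t, thetas))
```

As a sanity check, a constant sinogram h ≡ 1 back-projects to the constant 1 at every point x.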

15 of 69

slide-16
SLIDE 16

Important theorems

Theorem (Central Slice Theorem)

For any suitable function f defined on the plane and all real numbers r, θ: F₂f(r cos θ, r sin θ) = F(Rf)(r, θ), where F₂ and F are the 2-d and 1-d Fourier transforms, respectively.

Theorem (The Filtered Back-Projection Formula)

For a suitable function f defined in the plane: f(x) = (1/2) B{F⁻¹[|r| F(Rf)(r, θ)]}(x), x ∈ R².

16 of 69

slide-17
SLIDE 17

Fundamental question

Fundamental question of image reconstruction.

Is it possible to reconstruct a function f starting from its Radon transform Rf?

Answer (Radon, 1917).

Yes, we can, provided we know the value of the Radon transform for all (t, θ).

17 of 69
slide-18
SLIDE 18

Discrete problem

Ideal case: Rf(t, θ) known for all t, θ; infinite precision; no noise.

Real case: Rf(t, θ) known only on a finite set {(tj, θk)}j,k; finite precision; noise in the data.

18 of 69

slide-19
SLIDE 19

Fourier-based approach

Sampling: Rf(t, θ) → R_D f(jd, kπ/N)

Discrete transform: e.g. B_D h(x) = (1/N) Σ_{k=0}^{N−1} h(x cos(kπ/N) + y sin(kπ/N), kπ/N)

Filtering (low-pass): |r| = Fφ(r), with φ a band-limited function

Interpolation: {fk : k ∈ N} → If(x), x ∈ R

19 of 69

slide-20
SLIDE 20

Discrete problem

Filtered Back-Projection Formula:

f(x) = (1/2) B{F⁻¹[|r| · F(Rf(r, θ))]}(x)

Filtering:

f(x) = (1/2) B{F⁻¹[F(φ(r)) · F(Rf(r, θ))]}(x) = (1/2) B{F⁻¹[F(φ ∗ Rf(r, θ))]}(x) = (1/2) B[φ ∗ Rf(r, θ)](x)

Sampling and interpolation:

f(x1^m, x2^n) = (1/2) B_D I[φ ∗ R_D f(rj, θk)](x1^m, x2^n)

20 of 69
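The |r| filtering step can be sketched with a discrete Fourier transform (illustrative only; a practical implementation uses a band-limited filter φ as described above rather than the raw ramp):

```python
import numpy as np

def ramp_filter(proj):
    # Apply the ramp filter |r| to one projection Rf(., theta),
    # sampled on a uniform t-grid, by multiplication in the Fourier domain.
    r = np.fft.fftfreq(len(proj))               # discrete frequencies
    return np.real(np.fft.ifft(np.abs(r) * np.fft.fft(proj)))
```

Since |r| vanishes at r = 0, the filter removes the DC component: a constant projection is mapped to (numerically) zero.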

slide-21
SLIDE 21

Discrete problem: an example

Figure: Shepp-Logan phantom. Figure: Reconstruction with linear interpolation and 180 × 101 = 18180 samples.

21 of 69

slide-22
SLIDE 22

Outline

1. Image Reconstruction from CT

2. Radon transform

3. Algebraic Reconstruction Techniques (ART); Kernel approach; Regularization; Numerical results

22 of 69

slide-23
SLIDE 23

Algebraic Reconstruction Techniques (ART)

Unlike Fourier-based reconstruction, we consider a space G = span{gj : j = 1, . . . , n} of n basis functions and solve the reconstruction problem on all Radon lines L,

R_L(g) = R_L(f), using g = Σ_{j=1}^n cj gj.

Imposing interpolation, that is Rg(tk, θk) = Rf(tk, θk), k = 1, . . . , m, we obtain the linear system Ac = b with A_{k,j} = R gj(tk, θk), k = 1, . . . , m, j = 1, . . . , n.

Large, often sparse, linear system. Solution by iterative methods (Kaczmarz, MLEM, OSEM, LSCG) or SIRT techniques (see AIRtools by Hansen & Hansen, 2012).
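Kaczmarz's method, the first of the iterative solvers named above, can be sketched as follows (a minimal cyclic version for illustration; AIRtools provides production implementations):

```python
import numpy as np

def kaczmarz(A, b, sweeps=100, x0=None):
    # Cyclic Kaczmarz: successively project the current iterate onto the
    # hyperplane {x : <a_k, x> = b_k} defined by each row a_k of A.
    m, n = A.shape
    x = np.zeros(n) if x0 is None else x0.astype(float).copy()
    for _ in range(sweeps):
        for k in range(m):
            a = A[k]
            x += (b[k] - a @ x) / (a @ a) * a
    return x
```

For a consistent system the iterates converge to a solution; each sweep costs O(mn), which is why row-action methods suit the large sparse systems arising from ART.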

23 of 69

slide-24
SLIDE 24

ART reconstruction: Example 1

Figure: Bull’s eye phantom. Figure: 64 × 64 = 4096 reconstructed image with 4050 samples by Kaczmarz.

24 of 69

slide-25
SLIDE 25

ART reconstruction: Example 2

Figure: Shepp-Logan phantom. Figure: The phantom reconstructed by MLEM in 50 iterations.

25 of 69

slide-26
SLIDE 26

Hermite-Birkhoff interpolation

Let Λ = {λ1, . . . , λn} be a set of linearly independent linear functionals and fΛ = (λ1(f), . . . , λn(f))T ∈ Rn. The solution of a general H-B reconstruction problem:

H-B reconstruction problem

find g such that gΛ = fΛ or λk(g) = λk(f), k = 1, . . . , n .

26 of 69

slide-27
SLIDE 27

Hermite-Birkhoff interpolation

Let Λ = {λ1, . . . , λn} be a set of linearly independent linear functionals and fΛ = (λ1(f), . . . , λn(f))ᵀ ∈ Rⁿ. We seek the solution of a general H-B reconstruction problem:

H-B reconstruction problem

Find g such that gΛ = fΛ, i.e. λk(g) = λk(f), k = 1, . . . , n.

In our setting the functionals are λk(f) := Rk f = Rf(tk, θk), k = 1, . . . , n. The interpolation conditions

Σ_{j=1}^n cj λk(gj) = λk(f), k = 1, . . . , n

correspond to the linear system Ac = b as before.

26 of 69

slide-28
SLIDE 28

Hermite-Birkhoff interpolation

Theorem (Haar-Mairhuber-Curtis)

If Ω ⊂ R^d, d ≥ 2, contains an interior point, then there exist no Haar spaces of continuous functions except in the one-dimensional case.

The well-posedness of the interpolation problem is guaranteed if we no longer fix the set of basis functions in advance. Thus the basis functions gj should depend on the data:

gj(x) = λj^y(K(x, y)) [= R^y[K(x, y)](tj, θj)], j = 1, . . . , n

with the kernel K such that the matrix A = (λj^x[λk^y(K(x, y))])_{j,k} is nonsingular for all (tj, θj)

27 of 69

slide-29
SLIDE 29

Positive definite radial kernels

We choose a kernel K : R² × R² → R that is:

  • continuous;
  • symmetric: K(x, y) = K(y, x);
  • radial: K(x, y) = Φε(‖x − y‖), ε > 0;
  • positive definite (PD): Σ_{k,j=1}^n cj ck λj^x λk^y K(x, y) ≥ 0 for every set of linear functionals λj and for all c ∈ Rⁿ \ {0}

28 of 69

slide-30
SLIDE 30

Positive definite kernels: examples

Gaussian

Φε(x) = e^{−(ε‖x‖)²}, PD for all x ∈ R², ε > 0

Inverse multiquadrics

Φε(x) = 1/√(1 + (ε‖x‖)²), PD for all x ∈ R², ε > 0

Askey's compactly supported (or radial characteristic) functions

Φε(x) = (1 − ε‖x‖)₊^β = (1 − ε‖x‖)^β for ‖x‖ < 1/ε, and 0 for ‖x‖ ≥ 1/ε,

which are PD for any β > 3/2.

29 of 69


slide-32
SLIDE 32

A useful Lemma

Lemma

Let K(x, y) = φ(‖x − y‖) with φ(| · |) ∈ L¹(R). Then for any x ∈ R² the Radon transform R^y K(x, y) at (t, θ) ∈ R × [0, π) can be expressed as

(R^y K(x, y))(t, θ) = (R^y K(0, y))(t − xᵀnθ, θ).

This is the so-called shift invariant property of the Radon transform!

30 of 69

slide-33
SLIDE 33

Problem

Inverse multiquadric kernel: K(x, y) = 1/√(1 + ‖x − y‖²).

Applying the previous Lemma we have

R^y[K(0, y)](t, θ) = ∫_R 1/√(1 + t² + s²) ds = +∞

−→ the basis functions gj and the matrix A are not well defined ←−

31 of 69

slide-34
SLIDE 34

Regularization

Window function

Multiply the kernel K by a "window function" w such that R[K(x, y)w](t, θ) < ∞ for all (x, y) and (t, θ). This corresponds to using the linear operator R_w in place of R:

R_w[f](t, θ) = R[f w](t, θ).

We consider w radial: w = w(‖ · ‖)

32 of 69

slide-35
SLIDE 35

Example of window functions

Characteristic function: w(x) = χ[−L,L](x), L > 0

Gaussian: w(x) = e^{−ν²x²}, ν > 0

Compactly supported (Askey's family): w(x) = (1 − ν²x²)₊, ν > 0
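The effect of a window on the divergent inverse-multiquadric integral from the previous slide can be checked numerically (an illustrative sketch; the function name and parameters are not from the slides):

```python
import numpy as np

def imq_radon(t, s_max, window=None, n=200001):
    # Truncated Radon integral of the inverse multiquadric along a line:
    # integral over s in [-s_max, s_max] of w(sqrt(t^2+s^2)) / sqrt(1+t^2+s^2),
    # computed with the trapezoidal rule.
    s = np.linspace(-s_max, s_max, n)
    vals = 1.0 / np.sqrt(1.0 + t**2 + s**2)
    if window is not None:
        vals = vals * window(np.sqrt(t**2 + s**2))
    h = s[1] - s[0]
    return h * (vals.sum() - 0.5 * (vals[0] + vals[-1]))
```

Without a window the truncated integral grows like 2 log(s_max), so it diverges as the truncation is removed; with a Gaussian window w(r) = e^{−ν²r²} it stabilizes, so the basis functions become well defined.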

33 of 69

slide-36
SLIDE 36

Example: gaussian kernel

Gaussian kernel, shape parameter ε: K(x, y) = e^{−ε²‖x−y‖²}, ε > 0

Basis function: gj(x) = R^y[K(x, y)](tj, θj) = (√π/ε) e^{−ε²(tj − xᵀvj)²}, with vj = (cos θj, sin θj)

Matrix A = (a_{k,j}): a_{k,j} = R[gj](tk, θk) = +∞ if θj = θk
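The closed form of the Gaussian basis function can be verified against direct numerical integration of the kernel along the line (a sketch; the parameter values are arbitrary):

```python
import numpy as np

eps = 3.0
tj, thj = 0.4, 0.7                       # an arbitrary Radon line (t_j, theta_j)
vj = np.array([np.cos(thj), np.sin(thj)])

def g_closed(x):
    # g_j(x) = (sqrt(pi)/eps) * exp(-eps^2 * (t_j - x.v_j)^2)
    return np.sqrt(np.pi) / eps * np.exp(-eps**2 * (tj - x @ vj)**2)

def g_numeric(x, s_max=6.0, n=4001):
    # Direct line integral of K(x, y) = exp(-eps^2 ||x - y||^2)
    # over y on the line l(t_j, theta_j), by the trapezoidal rule.
    s = np.linspace(-s_max, s_max, n)
    y1 = tj * np.cos(thj) - s * np.sin(thj)
    y2 = tj * np.sin(thj) + s * np.cos(thj)
    vals = np.exp(-eps**2 * ((x[0] - y1)**2 + (x[1] - y2)**2))
    h = s[1] - s[0]
    return h * (vals.sum() - 0.5 * (vals[0] + vals[-1]))
```

The two values agree to high accuracy, since along the line ‖x − y‖² splits into (tj − xᵀvj)² plus a term integrated exactly by the Gaussian integral.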

34 of 69

slide-37
SLIDE 37

Example: gaussian kernel

Gaussian window function: w(x) = e^{−ν²x²}, ν > 0. Matrix A becomes

a_{k,j} = R[gj w](tk, θk) = (π / (ε √(ε²a² + ν²))) exp[−ν²(tk² + ε²b²/(ε²a² + ν²))]

where a = sin(θk − θj) and b = tj − tk cos(θk − θj), which never vanishes!

35 of 69

slide-38
SLIDE 38

Example: gaussian kernel reconstruction

Figure: Crescent-shaped phantom. Figure: 256 × 256 = 65536 reconstructed image with n = 4050 samples.

36 of 69

slide-39
SLIDE 39

A numerical experiment

Gaussian kernel Φε and Gaussian weight wν. Comparison with the Fourier-based reconstruction (relying on the FBP). Reconstructions from scattered Radon data and noisy Radon data.

Root Mean Square Error:

RMSE = √( (1/J) Σ_{i=1}^J (fi − gi)² )

where J is the dimension of the image and {fi}, {gi} are the greyscale values at the pixels of the original and the reconstructed image.
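The error measure is a direct transcription of the formula above (a trivial but convenient sketch):

```python
import numpy as np

def rmse(f, g):
    # RMSE = sqrt((1/J) * sum_i (f_i - g_i)^2), J = number of pixels
    f = np.asarray(f, dtype=float).ravel()
    g = np.asarray(g, dtype=float).ravel()
    return np.sqrt(np.mean((f - g)**2))
```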

37 of 69

slide-40
SLIDE 40

Kernel-based vs Fourier based: I

⋄ Test phantoms

Figure: crescent shape Figure: bull’s eye Figure: Shepp-Logan

38 of 69

slide-41
SLIDE 41

Geometries

Figure: Left: parallel beam geometry, 170 lines (10 angles and 17 Radon lines per angle). Right: scattered Radon lines, 170 lines.

39 of 69

slide-42
SLIDE 42

Kernel-based vs Fourier based: II

Using parallel beam geometry, i.e. θk = kπ/N, k = 0, . . . , N − 1 and tj = jd, j = −M, . . . , M, with sampling spacing d −→ 0: a (2M + 1) × N regular grid of Radon lines. No noise on the data. With N = 45, M = 40, ε = 60 we got:

Phantom     optimal ν   kernel-based   Fourier-based
crescent    0.5         0.102          0.120
bull's eye  0.4         0.142          0.134

Table: RMSE of kernel-based vs Fourier-based method

40 of 69

slide-43
SLIDE 43

Kernel-based vs Fourier based: III

Using scattered Radon data, with an increasing number of randomly chosen Radon lines n = 2000, 5000, 10000, 20000. No noise on the data. With ε = 50 and ν = 0.7:

Phantom     2000     5000     10000    20000
crescent    0.1516   0.1405   0.1431   0.1174
bull's eye  0.1876   0.1721   0.2102   0.1893

Table: RMSE of the kernel-based method for different numbers n of Radon lines

41 of 69

slide-44
SLIDE 44

Kernel-based vs Fourier based: IV

These experiments use noisy Radon data: we add Gaussian noise of zero mean and variance σ = 1e−3 to each of the three phantoms.

Parallel beam geometry, same ε and ν:

Phantom     kernel-based   Fourier-based
crescent    0.1502         0.1933
bull's eye  0.1796         0.2322

Table: RMSE of kernel-based vs Fourier-based with noisy data

Scattered Radon data, same ε and ν:

Phantom     noisy    noise-free
crescent    0.2876   0.1820
bull's eye  0.3140   0.2453

Table: RMSE with noisy and noise-free data

42 of 69

slide-45
SLIDE 45

Window function parameter

Gaussian kernel: K(x, y) = e^{−ε²‖x−y‖²}; Gaussian window function: w(x) = e^{−ν²x²}

Figure: Bull's eye phantom, ε = 30. (a) RMSE; (b) inverse condition number of A.

Trade-off principle (Schaback 1995)

43 of 69

slide-46
SLIDE 46

Kernel shape parameter

Multiquadric kernel, Gaussian window: K(x, y) = √(1 + ρ²‖x − y‖²) e^{−ε²‖x−y‖²}

Figure: (a) Crescent-shaped phantom; (b) Shepp-Logan phantom. Optimal values depend on the data.

44 of 69

slide-47
SLIDE 47

Comparison with FBP Formula

Figure: Crescent-shaped phantom: (a) FBP; (b) Gaussian kernel reconstruction (with optimal parameters ε∗, ν∗).

45 of 69

slide-48
SLIDE 48

Comparison with FBP Formula

  • RMSE of the same order (good!)
  • More computational time and memory usage (not so good!)

Figure: Computational time: (a) FBP; (b) multiquadric kernel.

46 of 69

slide-49
SLIDE 49

Part II Double weighted kernel-method

Work with A. Iske and G. Santin

47 of 69

slide-50
SLIDE 50

Outline

4. Anisotropic kernels: anisotropic basis functions; reconstruction matrix entries

5. Newton Bases

48 of 69

slide-51
SLIDE 51

Isotropic and anisotropic kernels

Isotropic (radially symmetric) kernel: K(x, y) = ϕ(‖x − y‖²), (x, y) ∈ R² × R²

Anisotropic (symmetric) kernel: K(x, y) = ϕ(‖x − y‖²) w(‖x‖²) w(‖y‖²), (x, y) ∈ R² × R² (1)

where w : [0, ∞) → [0, ∞) is a suitable weight function

49 of 69

slide-52
SLIDE 52

Well-definedness of the basis functions gj

Consider the Schwartz space (cf. Iske 94)

S := {γ ∈ C^∞(R^d; R) : D^p γ(x) x^q → 0 for all p, q ∈ N₀^d}

i.e. the set of rapidly decaying C^∞ functions.

Definition

A continuous and symmetric function K : R^d × R^d −→ R is said to be positive definite on S, K ∈ PD(S), iff

∫_{R^d} ∫_{R^d} K(x, y) γ(x) γ(y) dx dy > 0

for all γ ∈ S \ {0}.

50 of 69

slide-53
SLIDE 53

Construction of the anisotropic basis: I

For the weighted kernels with K anisotropic, the basis functions are

g_{t,θ}(x) = R^y_{t,θ}[ϕ(‖x − y‖²) w(‖y‖²)] w(‖x‖²), (t, θ) ∈ R × [0, π)

where R_{t,θ} is the Radon transform on the line ℓ = ℓ(t,θ). Simplifying notation, g(x) = h_{t,θ}(x) w(‖x‖²), where

h_{t,θ}(x) = R^y_{t,θ}[ϕ(‖x − y‖²) w(‖y‖²)] = ∫_{ℓ(t,θ)} ϕ(‖x − y‖²) w(‖y‖²) dy

51 of 69

slide-54
SLIDE 54

Construction of the anisotropic basis: II

Introducing the rotation matrix

Qθ = [cos θ  −sin θ; sin θ  cos θ] = [nθ, n⊥θ]

and letting xθ = Qθ⁻¹ x = Qθᵀ x = [xᵀnθ, xᵀn⊥θ]ᵀ ∈ R², we get

h_{t,θ}(x) = ∫_{ℓ(t,θ)} ϕ(‖x − y‖²) w(‖y‖²) dy
           = ∫_{ℓ(t,0)} ϕ(‖x − Qθ y‖²) w(‖Qθ y‖²) dy
           = ∫_{ℓ(t,0)} ϕ(‖Qθ⁻¹ x − y‖²) w(‖y‖²) dy
           = ∫_{ℓ(t,0)} ϕ(‖xθ − y‖²) w(‖y‖²) dy
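The rotation argument, and the resulting one-dimensional form of h_{t,θ}, can be checked numerically (a sketch with an arbitrary Gaussian profile and weight; here ϕ and w take the squared distance/norm as argument, matching the formulas above):

```python
import numpy as np

phi = lambda r2: np.exp(-r2)             # illustrative radial profile phi(||.||^2)
w   = lambda r2: np.exp(-0.5 * r2)       # illustrative weight w(||.||^2)

def _trap(vals, s):
    h = s[1] - s[0]
    return h * (vals.sum() - 0.5 * (vals[0] + vals[-1]))

def h_direct(x, t, th, s_max=8.0, n=4001):
    # Line integral over l(t, th): y(s) = t*n_th + s*n_th_perp
    s = np.linspace(-s_max, s_max, n)
    y1 = t * np.cos(th) - s * np.sin(th)
    y2 = t * np.sin(th) + s * np.cos(th)
    return _trap(phi((x[0] - y1)**2 + (x[1] - y2)**2) * w(y1**2 + y2**2), s)

def h_reduced(x, t, th, s_max=8.0, n=4001):
    # Reduced form: integrate over the vertical line l(t, 0) after rotating x
    s = np.linspace(-s_max, s_max, n)
    n_th = np.array([np.cos(th), np.sin(th)])
    n_perp = np.array([-np.sin(th), np.cos(th)])
    return _trap(phi((x @ n_th - t)**2 + (x @ n_perp - s)**2) * w(t**2 + s**2), s)
```

On the same quadrature grid the two expressions agree to rounding error, since ‖y‖² = t² + s² and ‖x − y‖² = (xᵀnθ − t)² + (xᵀn⊥θ − s)² hold exactly.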

52 of 69


slide-56
SLIDE 56

Construction of the anisotropic basis: III

Any y ∈ ℓ(t,0) has the form y = [t, s]ᵀ ∈ R² for a parameter s ∈ R. Setting v_{t,s} = [t, s]ᵀ = y we have

h_{t,θ}(x) = ∫_R ϕ((xᵀnθ − t)² + (xᵀn⊥θ − s)²) w(‖v_{t,s}‖²) ds

Proposition

For any anisotropic kernel K of our form, the basis functions g_{t,θ} have the form

g_{t,θ}(x) = [ ∫_R ϕ((xᵀnθ − t)² + (xᵀn⊥θ − s)²) w(‖v_{t,s}‖²) ds ] w(‖x‖²). (2)

Hence for (ϕ w)(| · |²) ∈ L¹(R) the functions g_{t,θ} : R² → [0, ∞) are well-defined.

53 of 69

slide-57
SLIDE 57

Reconstruction matrix entries: I

The reconstruction problem R_L(g) = R_L(f) amounts to solving a linear system Ac = b with matrix entries

a^{t,θ}_{r,φ} := R^x_{r,φ}[g_{t,θ}(x)] = R^x_{r,φ}[ R^y_{t,θ}[ϕ(‖x − y‖²) w(‖y‖²)] w(‖x‖²) ].

54 of 69

slide-58
SLIDE 58

Reconstruction matrix entries: II

By using the representation of the basis functions g_{t,θ} (omitting the algebra) we get

Proposition

For (t, θ), (r, φ) ∈ R × [0, π) we have

a^{t,θ}_{r,φ} = ∫_R ∫_R ϕ(‖Qφ v_{r,s̃} − Qθ v_{t,s}‖²) w(‖v_{t,s}‖²) w(‖v_{r,s̃}‖²) ds ds̃

Again, if (ϕ w)(| · |²) ∈ L¹(R) then the entries are well-defined and the reconstruction matrix A = (a^{tj,θj}_{rk,φk})_{1≤j,k≤n} ∈ R^{n×n} is symmetric positive definite. In particular the diagonal entries are

a^{t,θ}_{t,θ} = ∫_R ∫_R ϕ((s̃ − s)²) w(s² + t²) w(s̃² + t²) ds ds̃.

55 of 69

slide-59
SLIDE 59

Example: gaussian kernel

In the case of the Gaussian kernel ϕ(x − y2) = e−ε2x−y2 and weight w(x) = exp(−ν2x2) we get

gt,θ(x) = √π

  • ε2 + ν2 exp
  • −(ε2 + ν2)(t2 + x2) + 2ε2tnθ · x +

ε4 ε2 + ν2 (n⊥

θ · x)2

  • Matrix entries [DeMIS15]

For (t, θ), (r, φ) ∈ R × [0, π) the entries of the gaussian kernel matrix are the Radon transform of the gaussian basis gt,θ w.r.t. the line ℓr,φ, that is aj,k = Rr,φ[gt,θ] aj,k

  • = at,θ

r,φ

  • =

π √ 2

  • hε,ν(θ, φ)

exp[Φε,ν(r, t, θ, φ)] Φε,ν(r, t, θ, φ) = −2ν2(2ε2 + ν2) (ε2 + ν2)(r2 + t2) − 2ε2rt cos(θ − φ) hε,ν(θ, φ)

  • hε,ν(θ, φ)

= 2(ε2 + ν2)2 − 2ε4 cos2(θ − φ).

56 of 69

slide-60
SLIDE 60

Example: Gaussian kernel

In the case of the Gaussian kernel ϕ(‖x − y‖²) = e^{−ε²‖x−y‖²} and weight w(‖x‖²) = exp(−ν²‖x‖²) we get

g_{t,θ}(x) = (√π / √(ε² + ν²)) exp[ −(ε² + ν²)(t² + ‖x‖²) + 2ε² t nθ · x + (ε⁴/(ε² + ν²)) (n⊥θ · x)² ]

Matrix entries [DeMIS15]

For (t, θ), (r, φ) ∈ R × [0, π) the entries of the Gaussian kernel matrix are the Radon transforms of the Gaussian basis g_{t,θ} w.r.t. the line ℓ(r,φ), that is

a_{j,k} = R_{r,φ}[g_{t,θ}] = a^{t,θ}_{r,φ} = (π √2 / √(h_{ε,ν}(θ, φ))) exp[Φ_{ε,ν}(r, t, θ, φ)]

Φ_{ε,ν}(r, t, θ, φ) = −2ν²(2ε² + ν²) [ (ε² + ν²)(r² + t²) − 2ε² r t cos(θ − φ) ] / h_{ε,ν}(θ, φ)

h_{ε,ν}(θ, φ) = 2(ε² + ν²)² − 2ε⁴ cos²(θ − φ).

−→ The matrix is symmetric and positive definite ←−

56 of 69

slide-61
SLIDE 61

Example: bull-eye phantom

Figure: Bull's eye phantom, 64 × 64. Left: original. Right: approximated with parallel beam geometry, ε = 26 and ν = 0.3333. RMSE = 1.12e−1, PSNR = 67.2

57 of 69

slide-62
SLIDE 62

Example: Shepp-Logan phantom

Figure: Shepp-Logan phantom, 64 × 64. Left: original. Right: approximated with parallel beam geometry, ε = 26 and ν = 0.3333. RMSE = 2.2e−1, PSNR = 61.2

58 of 69

slide-63
SLIDE 63

Outline

4. Anisotropic kernels: anisotropic basis functions; reconstruction matrix entries

5. Newton Bases

59 of 69

slide-64
SLIDE 64

Newton Bases [MS JAT2011, PS JCAM2011]

Theorem (Basis factorization)

Any data-dependent basis U arises from a factorization A = V_U · C_U⁻¹, where V_U = (uj(xi))_{1≤i,j≤n}, U(x) = (u1(x), . . . , un(x)) ∈ Rⁿ is a data-dependent basis, and the coefficient matrix C_U is such that U(x) = T(x) · C_U, where T(x) = (K(x, x1), . . . , K(x, xn)).

Observation

The reconstruction matrix A is symmetric and positive definite, so A = L · Lᵀ (Cholesky decomposition). The Cholesky decomposition leads to the Newton basis, say N(x):

N(x) = T(x) · C_N = T(x) · (V_N)⁻ᵀ. (3)
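A minimal sketch of how the Cholesky factor yields the Newton basis coefficients (illustrative: a Gaussian kernel matrix on a few scattered points stands in for the Radon-based reconstruction matrix):

```python
import numpy as np

def newton_coefficients(A):
    # A symmetric positive definite: A = L L^T. The Newton basis is
    # N(x) = T(x) @ C_N with C_N = (V_N)^{-T} = L^{-T}, so that the values
    # at the data sites, V_N = A @ C_N = L, are lower triangular.
    L = np.linalg.cholesky(A)
    return np.linalg.inv(L).T

# Illustrative kernel matrix: Gaussian kernel on a few scattered points
pts = np.array([[0.0, 0.0], [0.5, 0.1], [0.2, 0.7], [0.9, 0.4]])
d2 = ((pts[:, None, :] - pts[None, :, :])**2).sum(-1)
A = np.exp(-4.0 * d2)

C = newton_coefficients(A)
V = A @ C                 # Newton basis evaluated at the points: equals L
```

V equals the Cholesky factor L, so N_j(x_i) = 0 for i < j: each new Newton basis function vanishes at all previously selected sites, which is what makes the recursive construction and the incremental selection of Radon lines possible.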

60 of 69

slide-65
SLIDE 65

Newton Bases

Observation

The Cholesky algorithm is recursive, so we can construct the Newton basis recursively [PS 2011].

Properties

Newton bases allow us to:
  • select the reconstruction lines;
  • solve a smaller system;
  • achieve a good compression of the data, thanks to the selection of line-points.

61 of 69

slide-66
SLIDE 66

How many Newton basis elements?

Figure: RMSE (left) and computation time (right) as functions of the number of Newton basis elements (selected lines), for the reconstruction of the Bull's Eye phantom.

62 of 69

slide-67
SLIDE 67

Point selection

Figure: 500 points are selected for the reconstruction of the Crescent Shape phantom sinogram, instead of 1500.

63 of 69

slide-68
SLIDE 68

Line selection

Figure: 200 lines selected for the reconstruction of the Crescent Shape phantom; the phantom reconstructed with 200 Newton basis elements.

64 of 69

slide-69
SLIDE 69

Double-weighted kernel methods vs ART methods

Figure: We compare a reconstruction with ART (first image) and the reconstruction with double-weighted kernel methods with 1500 (full data) Newton Bases (second image)

65 of 69

slide-70
SLIDE 70

Double-weighted kernel methods vs ART methods

Figure: Data with noise: we compare a reconstruction with ART (first image) and the reconstruction with double-weighted kernel methods with 1500 Newton Bases (second image)

66 of 69

slide-71
SLIDE 71

Double-weighted kernel methods vs ART methods

Figure: Missing Data: we compare a reconstruction with ART and the reconstruction with double-weighted kernel methods with 1500 Newton Bases (second image); Missing data: 40 % of Radon data

67 of 69

slide-72
SLIDE 72

Summary and future work

Done

1. Filtered Back-Projection Formula: efficiency
2. Kernel-based reconstruction: flexibility (double window function); arbitrary scattered Radon data

To be done

1. More on the error analysis
2. Conditionally positive definite kernels
3. Efficiency

68 of 69

slide-73
SLIDE 73

Thank you for your attention!

69 of 69