

SLIDE 1

Bayesian fusion of multi-band images
Beyond pansharpening

Nicolas Dobigeon

Joint work with Qi Wei, Jean-Yves Tourneret and José M. Bioucas-Dias
University of Toulouse, IRIT/INP-ENSEEIHT & TéSA
http://www.enseeiht.fr/~dobigeon

Winter School “Search for Latent Variables: ICA, Tensors, and NMF” Villard de Lans, February 2-4 2015

Nicolas Dobigeon Winter School “Search for Latent Variables”, Feb. 2-4 2015 1 / 67

SLIDE 5

Bayesian fusion of multi-band images Context

Multi-band imaging Multi/hyper-spectral images

◮ same scene observed at different wavelengths,
◮ each pixel represented by a vector of tens/hundreds of measurements.

Hyperspectral Cube

SLIDE 6

Bayesian fusion of multi-band images Context

Multi-band image enhancement
Overcome the spatial vs. spectral resolution trade-off.

Panchromatic images (PAN)
◮ no spectral resolution (only 1 band),
◮ very high spatial resolution (∼ 10 cm).

Multispectral images (MS)
◮ low spectral resolution (∼ 10 bands),
◮ high spatial resolution (∼ 1 m).

Hyperspectral images (HS)
◮ high spectral resolution (∼ 100 bands),
◮ low spatial resolution (∼ 10 m).

Objective of the fusion process: get the best of both resolutions.

SLIDE 8

Bayesian fusion of multi-band images Context

Multi-band image enhancement

Pansharpening: PAN+MS fusion
◮ incorporate the spatial details of the PAN image into the MS image,
◮ huge literature,
◮ main approaches rely on band substitution.

Hyperspectral pansharpening: PAN+HS fusion
◮ incorporate the spatial details of the PAN image into the HS image,
◮ more difficult due to the size of the HS image,
◮ specific methods should be developed.

Multi-band image fusion: MS+HS fusion
◮ incorporate the spatial details of the MS image into the HS image,
◮ more difficult since the spatial details are spread over the bands of a multi-band image,
◮ specific methods should be developed.

SLIDE 12

Bayesian fusion of multi-band images Context

Problem statement

Figure: (a) Hyperspectral image (size: 99 × 46 × 224, res.: 20 m × 20 m), (b) Multispectral image (size: 396 × 184 × 4, res.: 5 m × 5 m), (c) Target (size: 396 × 184 × 224, res.: 5 m × 5 m).

Table: Characteristics of some existing remote sensors.

Name     | AVIRIS (HS) | SPOT-5 (MS) | Pleiades (MS) | WorldView-3 (MS)
Res. (m) | 20          | 10          | 2             | 1.24
# bands  | 224         | 4           | 4             | 8

SLIDE 13

Bayesian fusion of multi-band images Context

Forward model

    Y_H = XBS + N_H,    Y_M = RX + N_M

◮ X ∈ R^(mλ×n): full-resolution unknown image
◮ Y_H ∈ R^(mλ×m) and Y_M ∈ R^(nλ×n): observed HS and MS images
◮ B ∈ R^(n×n): cyclic convolution operator acting on the bands
◮ S ∈ R^(n×m): downsampling matrix
◮ R ∈ R^(nλ×mλ): spectral response of the MS sensor
◮ N_H ∈ R^(mλ×m) and N_M ∈ R^(nλ×n): HS and MS noises

Figure: (a) Spatial blurring B. (b) Spectral blurring R.

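The forward model above can be sketched numerically. The following is a toy illustration only, with hypothetical (tiny) dimensions, a hand-made circulant blur B, a one-pixel-per-block downsampling S, and a crude band-averaging spectral response R; none of these matrices come from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (hypothetical, far smaller than a real AVIRIS scene)
m_lam, n_lam = 16, 4      # HS and MS band counts
side = 8                  # high-resolution image is side x side pixels
n = side * side           # number of high-resolution pixels
d = 2                     # downsampling factor
m = n // (d * d)          # number of low-resolution (HS) pixels

X = rng.random((m_lam, n))            # unknown full-resolution image

# B: cyclic (circular) convolution acting identically on every band
kernel = np.zeros(n)
kernel[[0, 1, n - 1]] = [0.5, 0.25, 0.25]
B = np.stack([np.roll(kernel, j) for j in range(n)], axis=1)  # n x n circulant

# S: downsampling matrix keeping one pixel out of every d x d block
keep = [r * side + c for r in range(0, side, d) for c in range(0, side, d)]
S = np.zeros((n, m))
S[keep, np.arange(m)] = 1.0

# R: MS spectral response, here a crude averaging of contiguous HS bands
R = np.zeros((n_lam, m_lam))
for j in range(n_lam):
    R[j, j * (m_lam // n_lam):(j + 1) * (m_lam // n_lam)] = n_lam / m_lam

# Observations: Y_H = XBS + N_H (low spatial res.), Y_M = RX + N_M (low spectral res.)
YH = X @ B @ S + 0.01 * rng.standard_normal((m_lam, m))
YM = R @ X + 0.01 * rng.standard_normal((n_lam, n))
```

Note how the HS observation loses pixels (right multiplication by BS) while the MS observation loses bands (left multiplication by R), matching the two degradation paths of the model.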
SLIDE 14

Bayesian fusion of multi-band images Context

Noise statistics

Gaussian assumption:

    N_H | s²_H ∼ MN_(mλ,m)( 0_(mλ,m), diag(s²_H), I_m )
    N_M | s²_M ∼ MN_(nλ,n)( 0_(nλ,n), diag(s²_M), I_n )

where
◮ s²_H = [s²_(H,1), ..., s²_(H,mλ)]ᵀ (hyperspectral noise variances),
◮ s²_M = [s²_(M,1), ..., s²_(M,nλ)]ᵀ (multispectral noise variances),

and the pdf of the matrix normal distribution MN_(n,p)(M, Σ_r, Σ_c) is defined by

    p(X | M, Σ_r, Σ_c) = exp( −(1/2) tr[ Σ_c⁻¹ (X − M)ᵀ Σ_r⁻¹ (X − M) ] ) / ( (2π)^(np/2) |Σ_c|^(n/2) |Σ_r|^(p/2) )

→ band-dependent noise
→ pixel-independent noise

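A matrix normal MN(M, Σ_r, Σ_c) variate can be drawn as M + A Z Bᵀ with Σ_r = AAᵀ, Σ_c = BBᵀ and Z standard Gaussian. The sketch below (illustrative, with made-up variances) checks that the noise model above indeed yields band-dependent, pixel-independent noise.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_matrix_normal(M, Sigma_r, Sigma_c, rng):
    """Draw one sample from MN(M, Sigma_r, Sigma_c) as X = M + A Z B^T,
    where Sigma_r = A A^T and Sigma_c = B B^T (Cholesky factors)."""
    A = np.linalg.cholesky(Sigma_r)
    Bc = np.linalg.cholesky(Sigma_c)
    Z = rng.standard_normal(M.shape)
    return M + A @ Z @ Bc.T

# Band-dependent, pixel-independent HS noise: Sigma_r = diag(s2_H), Sigma_c = I_m
m_lam, m = 5, 2000
s2_H = np.array([1.0, 0.5, 0.25, 0.1, 0.05])   # hypothetical per-band variances
NH = sample_matrix_normal(np.zeros((m_lam, m)), np.diag(s2_H), np.eye(m), rng)

empirical = NH.var(axis=1)   # per-band sample variances, should track s2_H
```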
SLIDE 15

Bayesian fusion of multi-band images Context

Likelihood of the observations

Given the forward model (characterized by both left- and right-operators)

    Y_H = XBS + N_H,    Y_M = RX + N_M

the two likelihoods are

    Y_H | X, s²_H ∼ MN_(mλ,m)( XBS, diag(s²_H), I_m )
    Y_M | X, s²_M ∼ MN_(nλ,n)( RX, diag(s²_M), I_n )

Joint likelihood
HS and MS images acquired by distinct sensors
→ independent HS and MS noises
→ observed images independent conditionally on X:

    f(Y_H, Y_M | X, s²) = f(Y_H | X, s²_H) f(Y_M | X, s²_M),    with s² = {s²_H, s²_M}

SLIDE 17

Bayesian fusion of multi-band images Context

Multi-band image fusion as an estimation problem

Maximum likelihood estimation / weighted least-squares regression:

    X̂ ∈ argmin_X  D²_(s²_M)(Y_M | RX) + D²_(s²_H)(Y_H | XBS)

where D_(s²_M)(·|·) and D_(s²_H)(·|·) are Mahalanobis distances associated with the noise variances.

Main issues
◮ (generally) ill-posed problem
◮ (generally) large-scale problem

⇒ Regularization required...

SLIDE 19

Bayesian fusion of multi-band images Context

Low-rank representation

Hyperspectral pixels live in a (much) lower-dimensional subspace¹.

Projection of the data X onto a lower-dimensional subspace of R^(mλ):

    X = VU,

where V is an mλ × m̃λ projection matrix (estimated or known a priori).

¹J. M. Bioucas-Dias et al., "Hyperspectral subspace identification," IEEE Trans. Geosci. and Remote Sens., vol. 46, no. 8, pp. 2435–2445, 2008.

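A minimal numpy sketch of this low-rank step, using a truncated SVD as a simple stand-in for the subspace-identification method cited on the slide (the data, dimensions, and noise level are all made up for illustration).

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic HS image whose pixels lie (up to noise) in a k-dimensional subspace
m_lam, n, k = 50, 400, 3
V_true = np.linalg.qr(rng.standard_normal((m_lam, k)))[0]   # orthonormal basis
U_true = rng.standard_normal((k, n))
X = V_true @ U_true + 1e-3 * rng.standard_normal((m_lam, n))

# Estimate the subspace with a truncated SVD (a crude stand-in for
# dedicated subspace-identification methods)
Uw, sv, _ = np.linalg.svd(X, full_matrices=False)
V = Uw[:, :k]                 # estimated m_lam x k projection basis

U = V.T @ X                   # projected (low-dimensional) representation
X_rec = V @ U                 # low-rank reconstruction X = V U
rel_err = np.linalg.norm(X - X_rec) / np.linalg.norm(X)
```

Working with U (k rows) instead of X (m_lam rows) is what makes the later samplers and optimizers tractable.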
SLIDE 21

Bayesian fusion of multi-band images Context

Bayesian framework as a convenient way of regularization

Bayesian paradigm
Key quantity:  f(U | Y_H, Y_M) ∝ f(Y_H, Y_M | U) f(U), with
→ Joint likelihood: f(Y_H, Y_M | U) (data-fitting term)
→ Prior: f(U) (probabilistic formulation of the regularization)

Computing the Bayesian estimators of U:

    Û_MAP = argmax_U f(U | Y_H, Y_M) = argmax_U f(Y_H, Y_M | U) f(U)

and/or

    Û_MMSE = E[U | Y_H, Y_M] = ∫ U f(U | Y_H, Y_M) dU = ∫ U f(Y_H, Y_M | U) f(U) dU / ∫ f(Y_H, Y_M | U) f(U) dU

and set X̂ = VÛ.

SLIDE 23

Bayesian fusion of multi-band images Context

Outline

Context
Gaussian prior modeling...
  Hierarchical Bayesian model
  Hybrid Gibbs Sampler
  Simulation Results
  Robustness with respect to R
... with unknown spectral response function
  Complementary prior modeling
  Hybrid Gibbs sampler
  Simulation Results
What about MAP estimation?
  Block coordinate descent algorithm
  Simulation Results
Dictionary-based sparse prior modeling
  Variational formulation
  Alternate Optimization Scheme
  Simulation Results
Conclusion

SLIDE 24

Bayesian fusion of multi-band images Gaussian prior modeling...

Outline

Context
Gaussian prior modeling...
  Hierarchical Bayesian model
    Parameter priors
    Hyperparameter priors
    Posterior distribution
  Hybrid Gibbs Sampler
  Simulation Results
    Fusion results on AVIRIS dataset
  Robustness with respect to R
... with unknown spectral response function
What about MAP estimation?
Dictionary-based sparse prior modeling
Conclusion

SLIDE 25

Bayesian fusion of multi-band images Gaussian prior modeling...

Parameter priors

Unknown parameter vector: θ = {U, s²_H, s²_M}

◮ Image projected in the lower-dimensional subspace: conjugate matrix Gaussian prior

    U | µ_u, Σ_u ∼ MN_(m̃λ,n)( µ_u, Σ_u, I_n )

with
  ◮ µ_u: spline-interpolated HS image projected onto the subspace,
  ◮ Σ_u: between-band covariance matrix (correlated bands),
  ◮ I_n: between-pixel covariance matrix (independent pixels).

◮ Noise variances: independent conjugate inverse-gamma priors

    s²_(H,ℓ), s²_(M,ℓ) | ν, γ ∼ IG(ν/2, γ/2),

a flexible distribution whose shape can be adjusted through (ν, γ).
SLIDE 27

Bayesian fusion of multi-band images Gaussian prior modeling...

Hyperparameter estimation

Unknown hyperparameter vector: Φ = {Σ_u, γ}

Frequentist approach
E.g., resorting to an expectation-maximization algorithm.

Full Bayesian approach
Introducing a second level in the Bayesian inference hierarchy:

    f(θ, Φ | Y_H, Y_M) ∝ f(Y_H, Y_M | θ) f(θ | Φ) f(Φ)

SLIDE 29

Bayesian fusion of multi-band images Gaussian prior modeling...

Hyperparameter priors

Unknown hyperparameter vector: Φ = {Σ_u, γ}

◮ Hyperparameter Σ_u: inverse-Wishart (IW) distribution

    Σ_u ∼ W⁻¹(Ψ, η),

where Ψ and η are fixed to provide a non-informative prior.

◮ Hyperparameter γ: Jeffreys' non-informative prior

    f(γ) ∝ (1/γ) 1_(R+)(γ)

SLIDE 30

Bayesian fusion of multi-band images Gaussian prior modeling...

Joint posterior

Using Bayes' theorem, the joint posterior distribution is

    f(θ, Φ | Y_H, Y_M) ∝ f(Y_H, Y_M | θ) f(θ | Φ) f(Φ)

where
◮ unknown parameters: θ = {U, s²},
◮ unknown hyperparameters: Φ = {Σ_u, γ}.

How can we estimate θ (and Φ)?
◮ Marginalize the hyperparameter γ.
◮ Sample from the joint posterior f(U, s², Σ_u | Y_H, Y_M) using a Markov chain Monte Carlo (MCMC) algorithm.

SLIDE 31

Bayesian fusion of multi-band images Gaussian prior modeling...

Bayesian sample-based estimation

Gibbs sampling principle
Let f(ϑ_1, ..., ϑ_N | y) denote a posterior distribution to be sampled.
◮ Iterative sampling according to the associated conditional distributions:

    ϑ_1 ∼ f(ϑ_1 | ϑ_2, ..., ϑ_N, y)
    ϑ_2 ∼ f(ϑ_2 | ϑ_1, ϑ_3, ..., ϑ_N, y)
    ...
    ϑ_N ∼ f(ϑ_N | ϑ_1, ..., ϑ_(N−1), y)

Properties
◮ The set of T samples {ϑ_1^(t), ..., ϑ_N^(t)}, t = 1, ..., T, is asymptotically distributed according to the joint distribution f(ϑ_1, ..., ϑ_N | y).
◮ The set of T samples {ϑ_j^(t)}, t = 1, ..., T, is asymptotically distributed according to the marginal distribution f(ϑ_j | y).
◮ Thus, MMSE estimators are approximated by (law of large numbers)

    ϑ̂_(j,MMSE) = E[ϑ_j | y] ≈ (1/T) Σ_(t=1)^T ϑ_j^(t)

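The Gibbs principle above can be illustrated on the classic toy case of a bivariate Gaussian with correlation ρ, where both conditionals are known in closed form (this toy target is not from the talk, just a standard textbook illustration).

```python
import numpy as np

rng = np.random.default_rng(3)

# Target: zero-mean bivariate Gaussian with unit variances and correlation rho.
# Conditionals: x1 | x2 ~ N(rho*x2, 1 - rho^2), and symmetrically for x2.
rho, T, burn = 0.8, 20000, 1000
x1, x2 = 0.0, 0.0
samples = np.empty((T, 2))
for t in range(T):
    x1 = rho * x2 + np.sqrt(1 - rho**2) * rng.standard_normal()
    x2 = rho * x1 + np.sqrt(1 - rho**2) * rng.standard_normal()
    samples[t] = x1, x2

chain = samples[burn:]
mmse_mean = chain.mean(axis=0)          # MMSE estimate of the mean (true: 0)
emp_rho = np.corrcoef(chain.T)[0, 1]    # should approach rho
```

The empirical correlation of the chain recovers ρ, exactly the "samples asymptotically distributed according to the joint" property stated above.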
SLIDE 33

Bayesian fusion of multi-band images Gaussian prior modeling...

Gibbs sampler for multi-band fusion

Initialization: U^(0) and s²^(0)
for t = 1 to N_MC do
    % Sampling the image covariance matrix
    Sample Σ_u^(t) from f(Σ_u | U^(t−1), s²^(t−1), Y_H, Y_M)
    % Sampling the multispectral noise variances
    for ℓ = 1 to nλ do
        Sample s²_(M,ℓ)^(t) from f(s²_(M,ℓ) | U^(t−1), Y_M)
    end for
    % Sampling the hyperspectral noise variances
    for ℓ = 1 to mλ do
        Sample s²_(H,ℓ)^(t) from f(s²_(H,ℓ) | U^(t−1), Y_H)
    end for
    % Sampling the high-resolved image
    Sample U^(t) from f(U | Σ_u^(t), s²^(t), Y_H, Y_M)
end for

SLIDE 34

Bayesian fusion of multi-band images Gaussian prior modeling...

Conditional distributions

Covariance matrix of the image Σ_u: inverse-Wishart distribution (easy)

    Σ_u | U, s², Y_H, Y_M ∼ W⁻¹( Ψ + Σ_(i=1)^(m_x m_y) (u_i − µ_u^(i)) (u_i − µ_u^(i))ᵀ, n + η )

Noise variance vector s²: inverse-gamma distributions (easy)

    s²_(H,ℓ) | U, Y_H ∼ IG( m/2, ‖[Y_H − VUBS]_ℓ‖² / 2 )
    s²_(M,ℓ) | U, Y_M ∼ IG( n/2, ‖[Y_M − RVU]_ℓ‖² / 2 )

where [·]_ℓ denotes the ℓ-th band (row) of the residual.

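The inverse-gamma conditional for one band's noise variance is cheap to sample: an IG(a, b) draw is the reciprocal of a Gamma(a, 1/b) draw. A sketch under made-up residuals (the true variance and sizes are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(4)

# Conjugate update for one band's noise variance: with a residual row e of
# length m, s2 | e ~ IG(m/2, ||e||^2 / 2). An inverse-gamma variate is the
# reciprocal of a gamma variate with the same shape and scale 2/||e||^2.
def sample_noise_variance(residual_row, rng):
    m = residual_row.size
    shape = m / 2.0
    rate = 0.5 * np.dot(residual_row, residual_row)
    return 1.0 / rng.gamma(shape, 1.0 / rate)

true_s2 = 0.04
e = np.sqrt(true_s2) * rng.standard_normal(5000)   # residuals for one band
draws = np.array([sample_noise_variance(e, rng) for _ in range(200)])
```

With many pixels the posterior concentrates, so the draws cluster tightly around the true variance.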
SLIDE 35

Bayesian fusion of multi-band images Gaussian prior modeling...

Conditional distributions (cont.)

Highly-resolved image U: high-dimensional Gaussian distribution (not easy)

    − log f(U | Σ_u, s², Y_H, Y_M) = (1/2) ‖Λ_H^(−1/2) (Y_H − VUBS)‖²_F + (1/2) ‖Λ_M^(−1/2) (Y_M − RVU)‖²_F
                                     + (1/2) Σ_(i=1)^n (u_i − µ_(u_i))ᵀ Σ_u⁻¹ (u_i − µ_(u_i)) + C

with Λ_H = diag(s²_H) and Λ_M = diag(s²_M).

Several strategies:
◮ Gibbs sampling of the associated conditional distributions
  → computationally intensive (pixelwise loop curse)
◮ Metropolis-Hastings step with a random proposal
  → computationally inefficient (high-dimensional space to explore)
◮ Metropolis-Hastings step with a relevant proposal
  → Hamiltonian Monte Carlo method

SLIDE 37

Bayesian fusion of multi-band images Gaussian prior modeling...

Hamiltonian Monte Carlo method

Metropolis-Hastings algorithm
◮ Candidate generation according to a proposal distribution q(·):

    U^(⋆) ∼ q(U)

◮ Accept U^(t+1) ← U^(⋆) with probability w = min(1, ρ), where

    ρ = [ f(U^(⋆) | ···) q(U^(t)) ] / [ f(U^(t) | ···) q(U^(⋆)) ]

Gradient-based proposal

    U^(⋆) = U^(t) + ε ΔU^(t),

where ΔU^(t) derives from the local curvature of the target distribution f(U | ·).

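A gradient-informed Metropolis-Hastings step can be sketched with a Langevin-type proposal on a small Gaussian target. This is an illustrative stand-in, not the talk's HMC sampler; the target, step size, and chain length are all made up.

```python
import numpy as np

rng = np.random.default_rng(5)

# 2-D Gaussian target N(mu, C); the proposal mean moves along the gradient
# of log f, and moves are accepted with probability min(1, rho).
mu = np.array([1.0, -1.0])
Cinv = np.linalg.inv(np.array([[1.0, 0.6], [0.6, 1.0]]))

def log_f(u):
    d = u - mu
    return -0.5 * d @ Cinv @ d

def grad_log_f(u):
    return -Cinv @ (u - mu)

eps, T = 0.5, 20000
u = np.zeros(2)
chain = np.empty((T, 2))
for t in range(T):
    mean_fwd = u + 0.5 * eps * grad_log_f(u)
    prop = mean_fwd + np.sqrt(eps) * rng.standard_normal(2)
    mean_bwd = prop + 0.5 * eps * grad_log_f(prop)
    # log q(prop|u) and log q(u|prop); the shared Gaussian normalizers cancel
    log_q_fwd = -0.5 * np.sum((prop - mean_fwd) ** 2) / eps
    log_q_bwd = -0.5 * np.sum((u - mean_bwd) ** 2) / eps
    log_rho = log_f(prop) - log_f(u) + log_q_bwd - log_q_fwd
    if np.log(rng.random()) < log_rho:
        u = prop
    chain[t] = u

post_mean = chain[2000:].mean(axis=0)   # discard burn-in, then average
```

Because the proposal is asymmetric, the q-ratio correction in log_rho is essential; dropping it would bias the chain.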
SLIDE 38

Bayesian fusion of multi-band images Gaussian prior modeling...

Qualitative results (AVIRIS dataset)

(a) Ref. (b) HS (c) MS (d) MAP (e) Wav. MAP (f) Proposed

SLIDE 39

Bayesian fusion of multi-band images Gaussian prior modeling...

Quantitative performance measures (1/2)

◮ RMSE/RSNR (root mean square error / reconstruction SNR): similarity measures between the target image X and the fused image X̂

    RMSE(X, X̂) = sqrt( (1/(n mλ)) ‖X − X̂‖²_F )

    RSNR(X, X̂) = 10 log₁₀( (1/(n mλ)) ‖X‖²_F / RMSE²(X, X̂) )

The larger the RSNR, the better the fusion quality.

◮ SAM (spectral angle mapper): spectral distortion between the actual and estimated pixels

    SAM(x_n, x̂_n) = arccos( ⟨x_n, x̂_n⟩ / (‖x_n‖₂ ‖x̂_n‖₂) )

The overall SAM is obtained by averaging the SAMs computed over all image pixels. The smaller the absolute value of the SAM, the smaller the spectral distortion.

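These two measures are a few lines of numpy. The sketch below follows the definitions on this slide (reconstructed from a partly garbled source, so treat the exact normalizations as assumptions); the test images are synthetic.

```python
import numpy as np

def rmse(X, Xhat):
    """Root mean square error over all n * m_lam entries."""
    return np.sqrt(np.sum((X - Xhat) ** 2) / X.size)

def rsnr(X, Xhat):
    """Reconstruction SNR in dB (larger is better)."""
    return 10.0 * np.log10(np.sum(X ** 2) / np.sum((X - Xhat) ** 2))

def sam_degrees(X, Xhat):
    """Mean spectral angle (degrees) over pixels; columns are pixel spectra."""
    num = np.sum(X * Xhat, axis=0)
    den = np.linalg.norm(X, axis=0) * np.linalg.norm(Xhat, axis=0)
    return np.degrees(np.arccos(np.clip(num / den, -1.0, 1.0))).mean()

rng = np.random.default_rng(6)
X = rng.random((224, 100)) + 0.1           # synthetic reference (bands x pixels)
Xhat = X + 0.01 * rng.standard_normal(X.shape)   # synthetic fused image
```

The clip inside sam_degrees guards against round-off pushing the cosine slightly outside [−1, 1].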
SLIDE 40

Bayesian fusion of multi-band images Gaussian prior modeling...

Quantitative performance measures (2/2)

◮ UIQI (universal image quality index): related to the correlation, luminance distortion and contrast distortion of the estimated image w.r.t. the reference image. The UIQI between two single-band images a and â is

    UIQI(a, â) = 4 σ_(aâ) µ_a µ_â / [ (σ²_a + σ²_â)(µ²_a + µ²_â) ]

where µ_a, µ_â, σ²_a, σ²_â are the sample means and variances of a and â, and σ_(aâ) is the sample covariance of (a, â). The range of UIQI is [−1, 1]; the larger the UIQI, the better the fusion result.

◮ DD (degree of distortion): the DD between two images X and X̂ is defined as

    DD(X, X̂) = (1/(n mλ)) ‖vec(X) − vec(X̂)‖₁.

The smaller the DD, the better the fusion.

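Both indices above can likewise be sketched directly from their definitions (the single-band test image below is synthetic, for illustration only).

```python
import numpy as np

def uiqi(a, ahat):
    """Universal image quality index between two single-band images."""
    mu_a, mu_b = a.mean(), ahat.mean()
    var_a, var_b = a.var(), ahat.var()
    cov = ((a - mu_a) * (ahat - mu_b)).mean()   # sample covariance
    return 4 * cov * mu_a * mu_b / ((var_a + var_b) * (mu_a**2 + mu_b**2))

def dd(X, Xhat):
    """Degree of distortion: mean absolute difference over all entries."""
    return np.abs(X - Xhat).mean()

rng = np.random.default_rng(7)
a = rng.random((64, 64)) + 0.5                   # synthetic single-band image
ahat = a + 0.05 * rng.standard_normal(a.shape)   # noisy estimate
q = uiqi(a, ahat)
```

For a perfect estimate, cov equals the common variance and the means match, so UIQI evaluates to 1; any mismatch in mean, variance, or correlation pulls it below 1.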
SLIDE 41

Bayesian fusion of multi-band images Gaussian prior modeling...

Quantitative results (AVIRIS dataset)

Table: Performance of HS+MS fusion methods in terms of RSNR (dB), UIQI, SAM (deg), DD (×10⁻²) and time (s).

Methods   | RSNR  | UIQI   | SAM  | DD   | Time (s)
MAP²      | 23.33 | 0.9913 | 5.05 | 4.87 | 1.6
Wavelet³  | 25.53 | 0.9956 | 3.98 | 3.89 | 31
Proposed  | 26.74 | 0.9966 | 3.40 | 3.33 | 530

Advantages
◮ Samples generated by the proposed method can be used to compute uncertainties about the estimates (confidence intervals).
◮ Generalization to more complex problems (non-Gaussianities, endmember uncertainty, etc.).
◮ Noise variance estimation.

²Hardie et al., "Application of the stochastic mixing model to hyperspectral resolution enhancement," IEEE Trans. Image Process., vol. 13, no. 9, pp. 1174–1184, Sept. 2004.
³Zhang et al., "Noise-resistant wavelet-based Bayesian fusion of multispectral and hyperspectral images," IEEE Trans. Geosci. and Remote Sens., vol. 47, no. 11, Nov. 2009.

SLIDE 42

Bayesian fusion of multi-band images Gaussian prior modeling...

Noise variance estimation


Figure : Noise variances and their MMSE estimates. (Top) HS image. (Bottom) MS image.

◮ Good estimation performance.
◮ Tracks the variations of the noise variances within a tolerable discrepancy.

SLIDE 43

Bayesian fusion of multi-band images Gaussian prior modeling...

Robustness with respect to R

FSNR: defined to adjust the assumed knowledge of R,

    FSNR = 10 log₁₀( ‖R‖²_F / (mλ nλ σ²) ),

where σ² is the variance of the perturbation added to R.

Figure: Spectral response R (top), R + noise (middle), and RSNR (dB) vs. FSNR (dB) for the MAP, wavelet and HMC methods (bottom).

When the FSNR is above 8 dB, the proposed method outperforms the MAP and wavelet-based MAP methods.

SLIDE 44

Bayesian fusion of multi-band images ... with unknown spectral response function

Outline

Context
Gaussian prior modeling...
... with unknown spectral response function
  Complementary prior modeling
  Hybrid Gibbs sampler
  Simulation Results
    Qualitative fusion results
    Quantitative fusion results
    Noise Variance Estimation
    Pseudo-spectral response estimation
What about MAP estimation?
Dictionary-based sparse prior modeling
Conclusion

SLIDE 45

Bayesian fusion of multi-band images ... with unknown spectral response function

How to proceed when R is unknown?

    Y_H = VUBS + N_H,    Y_M = RVU + N_M

◮ Estimate R̃ = RV ∈ R^(nλ×m̃λ) instead of R since
  ◮ the data live in a lower-dimensional subspace,
  ◮ the size of RV ∈ R^(nλ×m̃λ) is much smaller than that of R ∈ R^(nλ×mλ).

Remark: the original spectral response R is not easily recovered from R̃ since the matrix V is not invertible.

◮ Assign a non-informative matrix normal prior to R̃:

    R̃ | R̄, σ²_R ∼ MN_(nλ,m̃λ)( R̄, σ²_R I_(nλ), I_(m̃λ) )

◮ The mean response R̄ may come from prior knowledge.
◮ σ²_R is set to a (large) value to ensure a non-informative prior for R̃.

SLIDE 46

Bayesian fusion of multi-band images ... with unknown spectral response function

Gibbs sampler for multi-band fusion (unknown R̃)

Initialization: U^(0), s²^(0) and R̃^(0)
for t = 1 to N_MC do
    % Sampling the image covariance matrix
    Sample Σ_u^(t) from f(Σ_u | U^(t−1), s²^(t−1), R̃^(t−1), Y_H, Y_M)
    % Sampling the multispectral noise variances
    for ℓ = 1 to nλ do
        Sample s²_(M,ℓ)^(t) from f(s²_(M,ℓ) | U^(t−1), R̃^(t−1), Y_M)
    end for
    % Sampling the hyperspectral noise variances
    for ℓ = 1 to mλ do
        Sample s²_(H,ℓ)^(t) from f(s²_(H,ℓ) | U^(t−1), Y_H)
    end for
    % Sampling the pseudo-spectral response
    Sample R̃^(t) from f(R̃ | U^(t−1), s²_M^(t), Y_M)
    % Sampling the high-resolved image
    Sample U^(t) from f(U | Σ_u^(t), s²^(t), R̃^(t), Y_H, Y_M)
end for

SLIDE 47

Bayesian fusion of multi-band images ... with unknown spectral response function

Conditional distributions

Conditional distribution of R̃: matrix Gaussian distribution (easy)

    R̃ | U, s²_M, Y_M ∼ MN_(nλ,m̃λ)( µ_R̃, I_(nλ), Σ_R̃ )

with

    µ_R̃ = ( (1/s²_M) Y_M Uᵀ + (1/σ²_R) R̄ ) Σ_R̃
    Σ_R̃ = ( (1/s²_M) U Uᵀ + (1/σ²_R) I )⁻¹

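The posterior parameters on this slide amount to a ridge-regularized least-squares update. A toy numpy check, simplifying to a single (isotropic) MS noise variance and made-up sizes, that the posterior mean recovers a known R̃:

```python
import numpy as np

rng = np.random.default_rng(8)

# Toy setup: Y_M = Rt @ U + noise, with isotropic MS noise variance s2_M
# (a simplification) and a weak matrix normal prior Rt ~ MN(Rbar, s2_R I, I).
n_lam, k, n = 4, 6, 500
Rt_true = rng.standard_normal((n_lam, k))
U = rng.standard_normal((k, n))
s2_M, s2_R = 0.01, 100.0
YM = Rt_true @ U + np.sqrt(s2_M) * rng.standard_normal((n_lam, n))
Rbar = np.zeros((n_lam, k))

# Posterior parameters, mirroring the slide:
#   Sigma_R = ( U U^T / s2_M + I / s2_R )^{-1}
#   mu_R    = ( Y_M U^T / s2_M + Rbar / s2_R ) Sigma_R
Sigma_R = np.linalg.inv(U @ U.T / s2_M + np.eye(k) / s2_R)
mu_R = (YM @ U.T / s2_M + Rbar / s2_R) @ Sigma_R

rel_err = np.linalg.norm(mu_R - Rt_true) / np.linalg.norm(Rt_true)
```

With a large prior variance s2_R, the update is dominated by the data term and mu_R approaches the ordinary least-squares estimate of R̃.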
SLIDE 48

Bayesian fusion of multi-band images ... with unknown spectral response function

Qualitative results (AVIRIS dataset)

(c) Ref. (d) HS (e) MS (f) MAP (g) Wav. MAP (h) Oracle MMSE (i) MMSE

◮ AVIRIS data: 128 × 128 × 176
◮ Spatial degradation: 5 × 5 Gaussian blurring kernel, downsampling every 4 pixels
◮ Spectral degradation: LANDSAT spectral response (7 × 176)

SLIDE 49

Bayesian fusion of multi-band images ... with unknown spectral response function

Quantitative fusion results

Table: Fusion performance: RSNR (dB), UIQI and SAM (degrees).

Methods                              | RSNR  | UIQI   | SAM
MAP                                  | 16.66 | 0.9336 | 5.74
Wavelet MAP                          | 19.50 | 0.9626 | 4.19
MCMC with known (oracle) R           | 21.91 | 0.9771 | 3.09
MCMC with imperfect R (FSNR = 10 dB) | 21.80 | 0.9764 | 3.13
MCMC with unknown R                  | 21.90 | 0.9769 | 3.10

SLIDE 50

Bayesian fusion of multi-band images ... with unknown spectral response function

Noise variance estimation


Figure : Noise variances estimates. (Top) HS image. (Bottom) MS image.

◮ Good estimation performance.
◮ Tracks the variations of the noise variances within a tolerable discrepancy.

SLIDE 51

Bayesian fusion of multi-band images ... with unknown spectral response function

Pseudo-Spectral Response Estimation

Figure: True pseudo-spectral response R̃ (left) and its estimate (right). Relative error between the estimate and the ground truth:

    ‖R̃̂ − R̃‖₂ / ‖R̃‖₂ = 0.0314%

Remark: the true pseudo-spectral response is obtained by multiplying the spectral response of the LANDSAT satellite by the matrix V.

SLIDE 52

Bayesian fusion of multi-band images What about MAP estimation?

Outline

Context
Gaussian prior modeling...
... with unknown spectral response function
What about MAP estimation?
  Block coordinate descent algorithm
  Simulation Results
Dictionary-based sparse prior modeling
Conclusion

SLIDE 53

Bayesian fusion of multi-band images What about MAP estimation?

The negative logarithm of the joint posterior p(θ, Σ_u | Y_H, Y_M) is given as

    L(U, s², Σ_u) = − log p(θ, Σ_u | Y_H, Y_M)
                  = − log p(Y_H | θ) − log p(Y_M | θ) − Σ_(l=1)^n log p(u_l | Σ_u)
                    − Σ_(i=1)^(mλ) log p(s²_(H,i)) − Σ_(j=1)^(nλ) log p(s²_(M,j)) − log p(Σ_u) − C

◮ MAP estimator: minimize the function L(U, s², Σ_u) with respect to U, s² and Σ_u iteratively
◮ use of a block coordinate descent (BCD) algorithm

SLIDE 54

Bayesian fusion of multi-band images What about MAP estimation?

Block coordinate descent algorithm

Input: YH, YM, mλ, B, S, R, s²(0), Σ(0)
for t = 1, 2, . . . until stopping rule do
    U(t) = arg min_U L(U, s²(t−1), Σ(t−1))
    s²(t) = arg min_{s²} L(U(t), s², Σ(t−1))
    Σ(t) = arg min_Σ L(U(t), s²(t), Σ)
end
Output: Û (projected high-resolution HS image)

Remark: to be compared with the Gibbs sampling algorithm: instead of simulating according to the conditionals, we maximize!
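As a toy illustration of how such a BCD loop behaves, here is a minimal sketch on a two-block scalar objective (a hypothetical example, not the fusion objective L above): each block update is the exact minimizer with the other block held fixed, mirroring the U / s² / Σu sweeps of the slide.

```python
def bcd(x0, y0, lam=1.0, n_iter=50):
    """Block coordinate descent on f(x, y) = (x-1)^2 + lam*(x-y)^2 + (y-3)^2.

    Each update is the closed-form minimizer over one block with the
    other block fixed, so f decreases monotonically."""
    x, y = x0, y0
    for _ in range(n_iter):
        x = (1.0 + lam * y) / (1.0 + lam)   # argmin_x f(x, y) for fixed y
        y = (3.0 + lam * x) / (1.0 + lam)   # argmin_y f(x, y) for fixed x
    return x, y
```

With lam = 1, the iterates converge to the joint minimizer (x, y) = (5/3, 7/3); the contraction factor per sweep is λ²/(1+λ)², so convergence is geometric.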

SLIDE 55

Bayesian fusion of multi-band images What about MAP estimation?

The optimization with respect to U consists of minimizing

L_U(U) = ½ ‖Λ_H^(−1/2) (YH − VUBS)‖²_F + ½ ‖Λ_M^(−1/2) (YM − RVU)‖²_F + ½ ‖Σ_u^(−1/2) (U − µ_U)‖²_F.

Alternating Direction Method of Multipliers (ADMM)
Idea: transform the unconstrained optimization with respect to U into a constrained one via a variable splitting "trick", and then attack this constrained problem using an augmented Lagrangian (AL) method⁴.

⁴ M. Afonso et al., "An augmented Lagrangian approach to the constrained optimization formulation of imaging inverse problems," IEEE Trans. Image Process., vol. 20, no. 3, pp. 681–695, 2011.
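The variable-splitting "trick" can be sketched on a scalar toy problem (a hypothetical stand-in, not the authors' solver): the penalty is moved onto an auxiliary variable v constrained to equal u, and the scaled-dual ADMM iterations alternate two proximal updates and a dual ascent on the constraint.

```python
def admm_split(a, mu=1.0, rho=1.0, n_iter=200):
    """Variable splitting + ADMM (scaled dual form) for
        min_u 0.5*(u - a)^2 + 0.5*mu*u^2
    rewritten as
        min_{u,v} 0.5*(u - a)^2 + 0.5*mu*v^2  s.t.  u = v."""
    u = v = d = 0.0
    for _ in range(n_iter):
        u = (a + rho * (v - d)) / (1.0 + rho)   # u-update: prox of the data term
        v = rho * (u + d) / (mu + rho)          # v-update: prox of the penalty
        d = d + u - v                           # dual ascent on u = v
    return u
```

The closed-form solution of the original problem is u* = a/(1+µ), which the iterations recover; the same pattern applies blockwise when the data term and penalty involve operators (VUBS, P(·)) that cannot be diagonalized jointly.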

SLIDE 56

Bayesian fusion of multi-band images What about MAP estimation?

Qualitative results (AVIRIS dataset)

Figure : (a) Reference. (b) HS. (c) MS. (d) MAP. (e) Wavelet MAP. (f) MMSE. (g) Proposed.

SLIDE 57

Bayesian fusion of multi-band images What about MAP estimation?

Quantitative results (AVIRIS dataset)

Table : Performance of the fusion methods: RMSE (×10⁻²), UIQI, SAM (°) and time (seconds).

Methods    RMSE   UIQI     SAM    Time
Hardie     6.96   0.9932   5.15   3
Zhang      5.68   0.9956   4.22   72
MCMC       5.06   0.9971   3.73   6228
Proposed   5.10   0.9971   3.74   96

Much more computationally efficient than stochastic sampling, with comparable performance.
Latest news: a new strategy has been implemented, with computational time < 1 s!

SLIDE 59

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Outline Context Gaussian prior modeling... ... with unknown spectral response function What about MAP estimation? Dictionary-based sparse prior modeling Variational formulation

Dictionary-based regularization Dictionary Learning and Sparse Coding Re-estimation of the sparse code

Alternate Optimization Scheme

Optimization with respect to U Optimization with respect to A

Simulations Results

Comparison with other fusion methods Performance versus λd

Conclusion

SLIDE 60

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Penalized inverse problem

Based on the linear model and dimensionality reduction, fusing the HS and MS images can be formulated as the following inverse problem:

min_U  ½ ‖YH − VUBS‖²_F + (λm/2) ‖YM − RVU‖²_F + λd φ(U)

◮ ½ ‖YH − VUBS‖²_F is the HS data-fidelity term (∝ −ln p(YH|U))
◮ (λm/2) ‖YM − RVU‖²_F is the MS data-fidelity term (∝ −ln p(YM|U))
◮ λd φ(U) is a penalty ensuring appropriate regularization (∝ −ln p(U))
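For concreteness, the penalized objective can be evaluated as below. This is a sketch under assumptions: matrix shapes and the action of B and S by right-multiplication are illustrative choices, and `phi` stands in for any user-supplied regularizer.

```python
import numpy as np

def fusion_objective(U, YH, YM, V, B, S, R, lam_m, lam_d, phi):
    """Objective of the penalized inverse problem of the slide.

    Hypothetical shapes: U (m_lambda x n), V (L_h x m_lambda),
    B (n x n, blur), S (n x n_h, downsampling selection), R (L_m x L_h)."""
    hs_term = 0.5 * np.linalg.norm(YH - V @ U @ B @ S, 'fro') ** 2        # HS data fidelity
    ms_term = 0.5 * lam_m * np.linalg.norm(YM - R @ V @ U, 'fro') ** 2    # MS data fidelity
    return hs_term + ms_term + lam_d * phi(U)                             # + regularizer
```

When YH and YM are generated exactly from U through the forward model, the data terms vanish, which is a quick sanity check on the operator conventions.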

SLIDE 61

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Dictionary-based regularization

Motivation: self-similarity property of natural image patches

SLIDE 63

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Dictionary-based regularization

The patches of the target image U can be sparsely approximated on an over-complete dictionary (with columns referred to as atoms).

SLIDE 64

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Dictionary-based regularization

ℓ2-norm regularization (Gaussian prior):

φ(U) = ½ ‖U − Ū(D, A)‖²_F

Separating each band of the target image leads to

φ(U) = ½ Σ_{i=1}^{mλ} ‖Ui − P(DiAi)‖²₂

◮ Ui ∈ ℝⁿ is the ith band (or row) of U ∈ ℝ^{mλ×n}
◮ Di ∈ ℝ^{np×nat} is the dictionary dedicated to the ith band of U (np is the patch size and nat is the number of atoms), with D = [D1, · · · , Dmλ]
◮ Ai ∈ ℝ^{nat×npat} is the ith band's code (npat is the number of patches associated with the ith band), with A = [A1, · · · , Amλ]
◮ P(·) is a linear operator that averages the overlapping patches of each band to restore the target image
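A 1-D toy version of the patch-averaging operator P(·) might look as follows (hypothetical helper names; the actual operator acts on 2-D image patches): overlapping patches are summed back into place and divided by the per-sample overlap count.

```python
import numpy as np

def extract_patches(x, p, step=1):
    """Overlapping length-p patches of a 1-D signal (one patch per column)."""
    return np.stack([x[i:i + p] for i in range(0, len(x) - p + 1, step)], axis=1)

def P(patches, n, step=1):
    """Average overlapping patches back into a length-n signal: a 1-D toy
    analogue of the linear operator P(.) of the slide."""
    p = patches.shape[0]
    out = np.zeros(n)
    cnt = np.zeros(n)
    for j, i in enumerate(range(0, n - p + 1, step)):
        out[i:i + p] += patches[:, j]   # accumulate each patch in place
        cnt[i:i + p] += 1.0             # count how many patches cover each sample
    return out / cnt
```

When the patches are mutually consistent (extracted from a single signal), averaging reconstructs that signal exactly; this is what makes P(DiAi) a sensible band estimate from per-patch approximations DiAi.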

SLIDE 65

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

How can we obtain the dictionary D and the code A?

SLIDE 66

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Dictionary learning and sparse coding

Dictionary learning: learn the set of over-complete dictionaries D = [D1, · · · , Dmλ] by applying a DL algorithm to a rough estimate of U (constructed from the MS and HS images):
◮ K-SVD method
◮ Online Dictionary Learning (ODL) method

Sparse coding:
◮ Orthogonal Matching Pursuit (OMP): estimates the sparse code Ai (with at most nmax coefficients) for each band Ui
◮ Support (Ωi ⊂ ℕ², i = 1, · · · , mλ): the positions of the non-zero elements of the code Ai are also identified
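A minimal OMP sketch for a single signal (greedy atom selection followed by a least-squares refit on the current support; assumes unit-norm dictionary columns and is not tied to any specific library):

```python
import numpy as np

def omp(D, y, n_max):
    """Orthogonal Matching Pursuit: greedily select at most n_max atoms of D
    and return the sparse code together with its support."""
    r = y.copy()
    support = []
    a = np.zeros(D.shape[1])
    coef = np.zeros(0)
    for _ in range(n_max):
        j = int(np.argmax(np.abs(D.T @ r)))      # atom most correlated with residual
        if j not in support:
            support.append(j)
        coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)  # LS refit on support
        r = y - D[:, support] @ coef             # update residual
    a[support] = coef
    return a, sorted(support)
```

The LS refit at each step is what distinguishes OMP from plain matching pursuit: the residual is always orthogonal to the span of the selected atoms.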

SLIDE 67

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Re-estimation of the sparse code

Inspired by Bayesian hierarchical models, we propose to include the code A within the estimation process:

φ(U, A) = ½ Σ_{i=1}^{mλ} ‖Ui − P(DiAi)‖²_F + µa Σ_{i=1}^{mλ} ‖Ai‖₀

where ‖·‖₀ is the ℓ0 counting function (or ℓ0 "norm") and µa is a regularization parameter. By fixing the supports Ωi, the ℓ0 term reduces to a constant. Hence

φ(U, A) = ½ Σ_{i=1}^{mλ} ‖Ui − P(DiAi)‖²_F  s.t. Ai,\Ωi = 0

where Ai,\Ωi = {Ai(l, k) | (l, k) ∉ Ωi} denotes the entries of Ai outside the support.
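Once the support is frozen, the code update is a plain least-squares fit restricted to the supported atoms. A per-patch sketch in vector form (hypothetical helper, not the authors' implementation):

```python
import numpy as np

def refit_on_support(D, u, support):
    """Re-estimate a patch's code by least squares with a fixed support:
    entries outside `support` are constrained to zero, as in the slide."""
    a = np.zeros(D.shape[1])
    coef, *_ = np.linalg.lstsq(D[:, support], u, rcond=None)  # LS on the support only
    a[support] = coef
    return a
```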

SLIDE 69

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Final optimization problem

Joint optimization with respect to U and A:

min_{U,A} L(U, A) = ½ ‖YH − VUBS‖²_F + (λm/2) ‖YM − RVU‖²_F + (λd/2) Σ_{i=1}^{mλ} ‖Ui − P(DiAi)‖²_F,  s.t. Ai,\Ωi = 0

SLIDE 70

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Optimization with respect to U

min_U L(U) = ½ ‖YH − VUBS‖²_F + (λm/2) ‖YM − RVU‖²_F + (λd/2) Σ_{i=1}^{mλ} ‖Ui − P(DiAi)‖²_F

Difficulties
◮ Large dimensionality of U
◮ Diagonalization of the linear operators V(·)BS and P(·) is not possible

Alternating Direction Method of Multipliers (ADMM): again...

SLIDE 71

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Optimization with respect to A

Optimization with respect to Ai (i = 1, · · · , mλ) conditioned upon Ui:

min_{Ai} ‖Ui − P(DiAi)‖²_F  s.t. Ai,\Ωi = 0

Remarks
◮ The optimization with respect to Ai considers only the non-zero elements of Ai, denoted as Ai,Ωi = {Ai(l, k) | (l, k) ∈ Ωi}
◮ This is a standard least-squares (LS) problem which can be solved analytically

SLIDE 72

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Alternate optimization scheme

Input: YH, YM, B, S, R, SNRH, SNRM, mλ, nmax
Output: X̂ (high-resolution HS image)

◮ Approximate Ū using YM and YH          /* rough estimation of U */
◮ D̂ ← ODL(Ū)                             /* online dictionary learning */
◮ Â ← OMP(D̂, Ū, nmax)                    /* sparse coding */
◮ Ω̂ ← support(Â) (positions where Â ≠ 0)  /* computing support */
◮ V̂ ← PCA(YH, mλ)                         /* computing subspace transform matrix */

/* Start alternate optimization */
for t = 1, 2, . . . until stopping rule do
    Ût ∈ {U : L(U, Ât−1) ≤ L(Ût−1, Ât−1)}   /* solved with ADMM */
    Ât ∈ {A : L(Ût, A) ≤ L(Ût, Ât−1)}       /* solved with LS */
end
X̂ = V̂Û
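The alternating loop can be sketched generically; `update_u` and `update_a` are hypothetical stand-ins for the ADMM and LS block solvers, and the explicit check mirrors the slide's acceptance condition that each step must not increase L.

```python
def alternate(L, update_u, update_a, u0, a0, n_iter=30):
    """Alternate optimization with a monotone-descent acceptance test:
    a block update is accepted only if it does not increase L."""
    u, a = u0, a0
    for _ in range(n_iter):
        u_new = update_u(u, a)
        if L(u_new, a) <= L(u, a):   # U-step: e.g. a few ADMM iterations
            u = u_new
        a_new = update_a(u, a)
        if L(u, a_new) <= L(u, a):   # A-step: e.g. LS on the fixed support
            a = a_new
    return u, a
```

On a toy quadratic L with exact block minimizers as updates, the scheme converges to the joint minimizer, which illustrates why monotone block updates suffice for the fusion objective.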

SLIDE 73

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Comparison with other fusion methods (visual inspection)

Figure : (a) Reference. (b) HS. (c) MS. (d) MAP. (e) Wavelet MAP. (f) CNMF. (g) HMC. (h) Proposed.

SLIDE 74

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Comparison with other fusion methods (quantitative results)

Table : Performance of different MS + HS fusion methods (Pavia dataset): RMSE (×10⁻²), UIQI, SAM (degrees), ERGAS, DD (×10⁻³) and time (seconds).

Methods       RMSE    UIQI     SAM     ERGAS   DD      Time
MAP           1.148   0.9875   1.962   1.029   8.666   3
Wavelet MAP   1.099   0.9885   1.849   0.994   8.349   75
CNMF          1.119   0.9857   2.039   1.089   9.007   14
HMC           1.011   0.9903   1.653   0.911   7.598   6003
Proposed      0.947   0.9913   1.492   0.850   7.010   282

The proposed method provides promising results for all of the considered quality measures.

SLIDE 75

Bayesian fusion of multi-band images Dictionary-based sparse prior modeling

Performance versus λd

Figure : Performance of the proposed fusion algorithm versus λd (ranging from 10 to 50): (i) RMSE, (j) UIQI, (k) SAM, (l) DD, compared with the MAP, Wavelet MAP, CNMF and HMC methods.

SLIDE 76

Bayesian fusion of multi-band images Conclusion

Outline Context Gaussian prior modeling... ... with unknown spectral response function What about MAP estimation? Dictionary-based sparse prior modeling Conclusion

SLIDE 77

Bayesian fusion of multi-band images Conclusion

Conclusions

◮ fusion of multi-band images formulated as a linear inverse problem that explicitly exploits the forward model
◮ solved within a (hierarchical) Bayesian framework
◮ a first regularization consists of constraining the estimation to a lower-dimensional subspace
◮ two prior models have been introduced:
  ◮ Gaussian prior
  ◮ dictionary-based sparse prior
◮ the noise variances can be estimated jointly with the image to be recovered
◮ the spectral response R can be included in the estimation process, which gives better results

SLIDE 78

Bayesian fusion of multi-band images Conclusion

References I

  • L. Loncan, J. M. Bioucas-Dias, X. Briottet, J. Chanussot, N. Dobigeon, S. Fabre, W. Liao, G. Licciardi, M. Simões, J.-Y. Tourneret, M. Veganzones, G. Vivone, Q. Wei, and N. Yokoya, “Introducing hyperspectral pansharpening,” IEEE Geosci. Remote Sens. Mag., 2015, submitted.
  • R. C. Hardie, K. J. Barnard, and E. E. Armstrong, “Joint MAP registration and high-resolution image estimation using a sequence of undersampled images,” IEEE Trans. Image Process., vol. 6, no. 12, pp. 1621–1633, Dec. 1997.
  • M. T. Eismann and R. C. Hardie, “Application of the stochastic mixing model to hyperspectral resolution enhancement,” IEEE Trans. Geosci. and Remote Sens., vol. 42, no. 9, pp. 1924–1933, Sept. 2004.
  • R. C. Hardie, M. T. Eismann, and G. L. Wilson, “MAP estimation for hyperspectral image resolution enhancement using an auxiliary sensor,” IEEE Trans. Image Process., vol. 13, no. 9, pp. 1174–1184, Sept. 2004.
  • Y. Zhang, S. De Backer, and P. Scheunders, “Noise-resistant wavelet-based Bayesian fusion of multispectral and hyperspectral images,” IEEE Trans. Geosci. and Remote Sens., vol. 47, no. 11, pp. 3834–3843, Nov. 2009.

SLIDE 79

Bayesian fusion of multi-band images Conclusion

References II

  • Y. Zhang, A. Duijster, and P. Scheunders, “A Bayesian restoration approach for hyperspectral images,” IEEE Trans. Geosci. and Remote Sens., vol. 50, no. 9, pp. 3453–3462, Sep. 2012.
  • G. A. Licciardi, A. Villa, M. M. Khan, and J. Chanussot, “Image fusion and spectral unmixing of hyperspectral images for spatial improvement of classification maps,” in Proc. IEEE Int. Conf. Geosci. Remote Sens. (IGARSS), 2012.
  • G. A. Licciardi, M. M. Khan, J. Chanussot, A. Montanvert, L. Condat, and C. Jutten, “Fusion of hyperspectral and panchromatic images using multiresolution analysis and nonlinear PCA band reduction,” EURASIP J. Adv. Signal Process., vol. 2012, pp. 1–17, 2012.
  • G. Vivone, L. Alparone, J. Chanussot, M. Dalla Mura, Garzelli, and G. Licciardi, “Multi-resolution analysis and component substitution techniques for hyperspectral pansharpening,” in Proc. IEEE Int. Conf. Geosci. Remote Sens. (IGARSS), July 2014, pp. 2649–2652.
  • M. Simões, J. Bioucas-Dias, L. B. Almeida, and J. Chanussot, “Hyperspectral image superresolution: An edge-preserving convex formulation,” in Proc. IEEE Int. Conf. Image Processing (ICIP), 2014.

SLIDE 80

Bayesian fusion of multi-band images Conclusion

References III

  • M. Simões, J. Bioucas-Dias, L. Almeida, and J. Chanussot, “A convex formulation for hyperspectral image superresolution via subspace-based regularization,” 2015, to appear.
  • M. Veganzones, M. Simões, G. Licciardi, J. M. Bioucas-Dias, and J. Chanussot, “Hyperspectral super-resolution of locally low rank images from complementary multisource data,” in Proc. IEEE Int. Conf. Image Processing (ICIP), 2014.
  • O. Berné, A. Tielens, P. Pilleri, and C. Joblin, “Non-negative matrix factorization pansharpening of hyperspectral data: An application to mid-infrared astronomy,” in Proc. IEEE GRSS Workshop Hyperspectral Image Signal Process.: Evolution in Remote Sens. (WHISPERS), 2010, pp. 1–4.
  • N. Yokoya, T. Yairi, and A. Iwasaki, “Coupled nonnegative matrix factorization unmixing for hyperspectral and multispectral data fusion,” IEEE Trans. Geosci. and Remote Sens., vol. 50, no. 2, pp. 528–537, Feb. 2012.
  • Q. Wei, N. Dobigeon, and J.-Y. Tourneret, “Bayesian fusion of multi-band images,” ArXiv preprint 1307.5996, 2013.
  • ——, “Bayesian fusion of multispectral and hyperspectral images with unknown sensor spectral response,” in Proc. IEEE Int. Conf. Image Processing (ICIP), Paris, France, Oct. 2014, invited paper.

SLIDE 81

Bayesian fusion of multi-band images Conclusion

References IV

  • ——, “Bayesian fusion of hyperspectral and multispectral images,” in Proc. IEEE Int. Conf. Acoust., Speech, and Signal Processing (ICASSP), Florence, Italy, May 2014.
  • ——, “Bayesian fusion of multispectral and hyperspectral images using a block coordinate descent method,” in Proc. IEEE GRSS Workshop Hyperspectral Image Signal Process.: Evolution in Remote Sens. (WHISPERS), Tokyo, Japan, June 2015, submitted.
  • Q. Wei, J. M. Bioucas-Dias, N. Dobigeon, and J.-Y. Tourneret, “Hyperspectral and multispectral image fusion based on a sparse representation,” IEEE Trans. Geosci. and Remote Sens., 2015, to appear.

SLIDE 82

Bayesian fusion of multi-band images Conclusion

Bayesian fusion of multi-band images

Beyond pansharpening

Nicolas Dobigeon

Joint work with Qi Wei, Jean-Yves Tourneret and Jose M. Bioucas-Dias University of Toulouse, IRIT/INP-ENSEEIHT & T´ eSA http://www.enseeiht.fr/˜dobigeon

Winter School “Search for Latent Variables: ICA, Tensors, and NMF” Villard de Lans, February 2-4 2015
