

SLIDE 1

DeepSkyFusion*

multisource data fusion from astronomical images

André Jalobeanu PASEO Research Group MIV team @ LSIIT, Illkirch, France

PASEO

*part of SpaceFusion project ANR “Jeunes Chercheurs” 2005-2008

  • Obs. Strasbourg - Apr 13, 2007
SLIDE 2

Outline

Introduction
  • Objectives (Astronomy)
  • Proposed approach

Bayesian inference from multiple observations
  • Accurate forward modeling
  • Preliminary results: validation in 1D and 2D
  • Estimating, storing and using uncertainties

Extensions to multispectral/hyperspectral data
  • Multispectral imaging
  • Integral Field Spectroscopy (MUSE)

SLIDE 3

Why multisource data fusion?

Multisource data fusion:
  • Optimally combine all observations into a single object
  • Preserve all the information from the original data set
  • Increase resolution if needed
  • Compute the uncertainties
  • Reconstruct the 3D geometry if required

Enhance the object quality (optional):
  • Denoise or deblur depending on the degradation

Problem: lots of data, same object!

Images are recorded with various:
  • positions, orientations (pose)
  • sensors (resolution, noise, bad pixels)
  • observing conditions (transparency, haze)
  • instruments (blur, distortions)

SLIDE 4

The SpaceFusion project

Name, position, lab, time %:
  • André Jalobeanu: CR, LSIIT/MIV Illkirch, 90%
  • Christophe Collet: PU, LSIIT/MIV Illkirch, 40%
  • Mireille Louys: MCF, LSIIT/MIV Illkirch, 40%
  • Fabien Salzenstein: MCF, InESS Strasbourg, 40%
  • Françoise Nerry: CR, LSIIT/TRIO Illkirch, 20%
  • Albert Bijaoui / Eric Slezak: A/AA, OCA Nice, 10%
  • Bernd Vollmer: A, Obs. Strasbourg, 10%
  • J.A. Gutiérrez (2006), F. Brisc, M. Joshi (2007): postdocs, Illkirch, 20 months total

Projet ANR “Jeunes Chercheurs 2005” (French Research Agency), 3-year grant, Jan 2006 - Dec 2008

SLIDE 5

SpaceFusion objectives

  • Produce a corrected, super-resolved image in astronomy
  • Reconstruct a reflectance function in remote sensing
  • Recover the geometry of small bodies and planetary surfaces
  • Reconstruct both reflectance and topography in Earth/Space Sciences

SLIDE 6

Astronomy: image reconstruction

Input: multiple images (single band, multispectral or IFS)
  • Virtual Observatory
  • Optical, UV, IR / calibrated or not / missing or corrupted data

Output:
  • Single model: 2D (image-like), well-sampled
  • Uncertainties (simplified inverse covariance)
  • If applicable, spatial and spectral super-resolution

DeepSkyFusion: multisource data fusion and 2D super-resolution for Astronomy & Astrophysics.

Preserve the information from the original data set: photometry, astrometry, noise statistics.

SLIDE 7

DeepSkyFusion project: task layout

Astronomy, 2D: DeepSkyFusion

Theory tasks, spanning independent pixel processing (0D) to 2D image processing:
  • Rendering (2D image formation)
  • Parameter estimation via marginalization
  • Covariance simplification
  • Recursive inference
  • Noise modeling, handling, estimation
  • Missing data and outlier management
  • Registration
  • Error propagation (processing chains)
  • Efficient optimization schemes
  • Image fusion, multiple images, known parameters
  • Image fusion, single image, known parameters
  • Multispectral data fusion
  • Intensity/uncertainty quantization
  • Dynamic range fusion
  • Dynamic range fusion with PSF
  • Format specification
  • PSF inference
  • Efficient image modeling

Status (Oct. 2006): each task is either defined, partly completed, or completed and validated.

SLIDE 8

Related, existing methods (2D)

  • Frame coaddition, drizzling: increase the SNR and the dynamic range
  • Mosaicing (“Montage”): increase the field of view
  • Super-resolution: de-aliasing, bandwidth extrapolation
  • Multiframe restoration: partial turbulence compensation

SLIDE 9

The proposed approach (single band images)

  • Use Bayesian inference to recover a single object from all observations
  • In 2D: recover a well-sampled image, possibly super-resolved
  • Check the validity of this approach in 1D & 2D (first results)
  • Provide uncertainty estimates, allow for recursive data processing

Diagram: the observed images Y are generated from the object f through a geometric mapping (camera pose and internal camera parameters), the global PSF h, the sensor sampling grid and noise; the rendering coefficients summarize this image formation.

SLIDE 10

Bayesian inference

p(θ | observations) = p(observations | θ) p(θ) / p(observations)

  • θ: parameters of interest (unknown solution)
  • p(observations | θ): likelihood, image formation model
  • p(θ): prior model (a priori knowledge about the observed object)
  • p(observations): evidence (useful for model comparison)

! All parameters are random variables
! Bayesian inference: functional optimization / approximations
! Deterministic optimization techniques for speed

OBJECTIVE: posterior probability density function (pdf)
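The update rule above can be made concrete with a tiny numeric sketch. The Gaussian case below is illustrative only (hypothetical prior and noise values), chosen because the posterior is then available in closed form:

```python
# Minimal sketch of Bayes' rule for a scalar parameter theta with a
# Gaussian prior and a Gaussian likelihood (conjugate case, so the
# posterior pdf is Gaussian too). Numbers are illustrative only.

def gaussian_posterior(m0, v0, y, v):
    """Posterior mean/variance of theta ~ N(m0, v0) after observing
    y ~ N(theta, v). Follows p(theta | y) ∝ p(y | theta) p(theta)."""
    v_post = 1.0 / (1.0 / v0 + 1.0 / v)   # precisions add
    m_post = v_post * (m0 / v0 + y / v)   # precision-weighted means
    return m_post, v_post

m, v = gaussian_posterior(m0=0.0, v0=4.0, y=2.0, v=1.0)
print(m, v)  # posterior mean pulled toward the (more precise) observation
```

The same precision-weighting structure reappears in the fusion formulas of the next slides.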

SLIDE 11

Probabilistic data fusion vs. averaging

Diagram: plain averaging vs. probabilistic fusion of Result #1 and Result #2.

Probabilistic fusion:
  • Takes uncertainties into account: variance, correlations
  • Formal framework for the combination of multiple observations
  • Propagates uncertainties from the observation noise to the end result!
  • Downside: algorithms ought to account for input uncertainties
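The contrast between averaging and probabilistic fusion can be sketched in a few lines. The example below is a hypothetical scalar case: fusion weights each result by its inverse variance, so the more certain result dominates:

```python
# Sketch: probabilistic fusion of two independent Gaussian measurements
# vs. plain averaging. Fusion weights each result by its inverse
# variance (precision). Illustrative numbers, not from the slides.

def fuse(means, variances):
    w = [1.0 / v for v in variances]     # precision weights
    var = 1.0 / sum(w)                   # fused variance
    mean = var * sum(wi * m for wi, m in zip(w, means))
    return mean, var

means, variances = [1.0, 3.0], [1.0, 0.25]  # result #2 is 4x more precise
avg = sum(means) / len(means)               # ignores uncertainties
m, v = fuse(means, variances)
print(avg, m, v)  # 2.0 vs. 2.6 (closer to the precise result), variance 0.2
```

With equal variances the two answers coincide; the gap grows with the precision imbalance.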

SLIDE 12

Bayesian inference from multiple observations

Diagram: the unknown object X generates the observations Y_1 ... Y_n, given the image model parameters and the camera parameters.

Forward modeling:

  • Object modeling (image, 3D geometry, reflectance...)
  • Image formation = rendering + degradations

Bayesian inference:

  • Estimate the optimal object given all observations
  • Integrate w.r.t. all nuisance variables: “marginalization”
  • Evaluate the uncertainties:

Covariance matrix, if Gaussian approx. of the posterior pdf

  • Model selection and assessment
SLIDE 13

Band-limited astronomical image model

Model of the unknown object (2D image): choose an appropriate parametrization and topology
  • Sampling grid size, lattice, ...

Understand the sampling theorem!
  • Don’t try to go beyond the Nyquist rate: optical frequency cut-off!
  • Near-optimal sampling, band-limited: target image = sum of cubic B-spline (B-Spline 3) kernels
  • Output pixel size < input blur size / 2

Constrain and stabilize this inverse problem
  • Use smoothness priors to avoid noise amplification (oversampled areas will undergo a deconvolution...)
  • Use efficiently designed prior models to help preserve useful information while filtering the noise
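The cubic B-spline parametrization above can be sketched directly; a minimal illustration (not the project's code) of the kernel and of a target image written as a sum of shifted kernels:

```python
# Sketch of the cubic B-spline kernel used as the target image basis:
# the band-limited model is a sum of shifted copies of beta3 weighted
# by coefficients L_j, i.e. X(x) = sum_j L_j * beta3(x - j).

def beta3(x):
    """Cubic B-spline kernel (support [-2, 2], unit integral)."""
    ax = abs(x)
    if ax < 1.0:
        return 2.0 / 3.0 - ax**2 + ax**3 / 2.0
    if ax < 2.0:
        return (2.0 - ax)**3 / 6.0
    return 0.0

def reconstruct(coeffs, x):
    """Evaluate X(x) = sum_j coeffs[j] * beta3(x - j)."""
    return sum(c * beta3(x - j) for j, c in enumerate(coeffs))

print(beta3(0.0), beta3(1.0))  # 2/3 and 1/6
# Shifted kernels sum to one, so constant coefficients give a flat image:
print(reconstruct([1.0, 1.0, 1.0, 1.0, 1.0], 2.0))  # interior point -> 1.0
```

The partition-of-unity property is what makes the basis well-suited to photometry-preserving resampling.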

SLIDE 14

Directed graphical models:

  • Node: set of random variables
  • No incoming arrow: prior pdf
  • Arrow: dependence (causality)
  • Set of incoming arrows: conditional pdf
  • Joint distribution (θ_n: camera pose parameters, ψ: internal/global parameters, ω: prior model parameters): P(X, Y, θ, ψ, ω) = P(ω) P(ψ) P(X | ω) ∏_n P(θ_n) P(Y_n | X, θ_n, ψ)
  • Posterior marginal: P(X | Y) ∝ ∫ P(X, Y, θ, ψ, ω) dθ dψ dω

Simplified graphical model (diagram): the scene model X, with its prior model parameters, generates the n observations Y_n through the resolution and camera pose parameters, which are integrated out (marginalization). The legend distinguishes random variables from observed variables.

SLIDE 15

Resampling scheme

Image formation scheme

For each sensor n, each observed pixel p is a linear combination of the model coefficients:

I^n_p = Σ_j λ^n_{pj} L_j,  where the rendering coefficients λ^n_{pj} are a function of g and h

Image formation steps:
  • Deformation: geometric mapping g (parameters: external pose, internal camera, density), e.g. affine g(u) = W_θ u + b_θ
  • Convolution with the Point Spread Function (PSF) h:  I_p = ((T ∘ g) ⋆ h)(π_p)
  • Sampling on a discrete pixel grid
  • L_j: B-spline interpolation coefficients such that the target image X = L ⋆ s

[MaxEnt 2006]
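The linear relation I^n_p = Σ_j λ^n_{pj} L_j means rendering is just a (sparse) matrix-vector product; a toy sketch with made-up rendering coefficients and sizes:

```python
import numpy as np

# Sketch of the rendering step: each observed pixel is a linear
# combination of the B-spline coefficients L_j through the rendering
# coefficients lambda[p, j]. Values below are made up for illustration;
# in practice each row is the sparse footprint of the warped PSF.

rng = np.random.default_rng(0)
J = 6                       # number of model coefficients
L = rng.normal(size=J)      # model (B-spline coefficients)

lam = np.array([            # rendering matrix for one sensor
    [0.2, 0.6, 0.2, 0.0, 0.0, 0.0],
    [0.0, 0.2, 0.6, 0.2, 0.0, 0.0],
    [0.0, 0.0, 0.2, 0.6, 0.2, 0.0],
    [0.0, 0.0, 0.0, 0.2, 0.6, 0.2],
])

I_clean = lam @ L           # noiseless image formation I_p = sum_j lam[p,j] L_j
noise_var = 0.01
Y = I_clean + rng.normal(scale=noise_var**0.5, size=I_clean.shape)
print(Y.shape)              # (4,): one value per sensor pixel
```

Stacking one such operator per sensor gives the multi-observation forward model inverted on the next slides.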
SLIDE 16

Rendering coefficient computation

Diagram: the target PSF (cubic B-spline) is defined in model (world) space; the instrument PSF at pixel p is defined in sensor space; the geometric mapping g warps and shifts each of the n instrument PSFs onto the irregular sampling grid.

Rendering coefficients (nth sensor PSF at pixel p, warped via the Jacobian of the geometric mapping g):

λ^n_{pj} = (∆^n_p h^n_p)(g(π_p) − j),  with the notation ∆^n_p = J^{-1}_{π_p} (inverse Jacobian of g at pixel p)

[ADA 2006]

SLIDE 17

Inversion (known parameters)

1) Optimum: functional minimization problem. Energy U = −log P(L | Y, θ, ω):

U(L) = Φ(L) + Σ_n D_n(L, Y^n)

Prior term (ω: regularization weight, D: first-order derivative, S: B-spline kernel operator):

Φ(L) = ω ‖DX‖² = ω ‖RL‖²,  with R = DS = d ⋆ s

Data term (v^n_p: noise variance at pixel p of observation n):

D_n(L, Y^n) = ½ Σ_p (λ^n_p · L − Y^n_p)² / v^n_p

2) Uncertainties: 2nd derivatives at the optimum
2) Uncertainties: 2nd derivatives at the optimum

SLIDE 18

Inversion algorithm

Initialization (... looks like drizzling!):

L_j ← Σ_{n,p} (λ^n_p)_j Y^n_p / Σ_{n,p} (λ^n_p)_j

Iterative, deterministic optimization: U(L) is a quadratic form, minimized by conjugate gradient.

Gradient:  ∇U = 2ω RᵀR L + Σ_{n,p} λ^n_p (λ^n_p · L − Y^n_p) / v^n_p

Hessian:  ∇²U = 2ω RᵀR + Σ_{n,p} (1/v^n_p) λ^n_p (λ^n_p)ᵀ
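Setting the gradient to zero gives a linear system A L = b, which a plain conjugate-gradient loop can solve; a toy sketch (hypothetical sizes, made-up coefficients, dense matrices instead of the sparse operators used in practice):

```python
import numpy as np

# Sketch of the inversion: minimize the quadratic energy
#   U(L) = omega * ||R L||^2 + (1/2) sum_p (lam_p . L - Y_p)^2 / v_p
# by solving grad U = 0, i.e. A L = b with
#   A = 2*omega*R^T R + sum_p lam_p lam_p^T / v_p,  b = sum_p lam_p Y_p / v_p.

rng = np.random.default_rng(1)
J, P = 5, 12
lam = rng.normal(size=(P, J))      # made-up rendering coefficients
v = np.full(P, 0.1)                # per-pixel noise variances
L_true = rng.normal(size=J)
Y = lam @ L_true                   # noiseless observations for the demo
R = np.diff(np.eye(J), axis=0)     # first-order difference (smoothness prior)
omega = 1e-3

A = 2 * omega * R.T @ R + (lam / v[:, None]).T @ lam
b = (lam / v[:, None]).T @ Y

def cg(A, b, iters=200, tol=1e-12):
    """Basic conjugate gradient for a symmetric positive definite A."""
    x = np.zeros_like(b)
    r = b - A @ x
    p = r.copy()
    rs = r @ r
    for _ in range(iters):
        Ap = A @ p
        alpha = rs / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if rs_new < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return x

L_hat = cg(A, b)
print(np.abs(A @ L_hat - b).max())  # small residual
```

The Hessian A is also the inverse covariance of the solution, which is what the uncertainty slides below exploit.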

SLIDE 19

First results: 2X super-resolution (1D signals)

Figure: two observations (blur, noise, 1/3-sample shift); reconstructed signal with 95% confidence interval, data points and ideal signal; real PSF vs. target PSF in model space. De-aliasing + regularized deconvolution.

[MaxEnt 2006]

SLIDE 20

2D Super-Resolution results (astronomy)

Experimental setting: 4 noisy images, undersampled (factor 2), shifted (1/2 pixel); pointing pattern shown in model space.

Figures: image fusion result (mean); inverse covariance of the result (diagonal terms, and near-diagonal covariance terms).

[ADA 2006]

SLIDE 21

Comparison with existing techniques

Figures: reference image, initialization, pixel interlacing result, drizzling result, and image fusion result (mean).

SLIDE 22

Real data? (HST WFPC2)

4 images of the same scene, shifted, undersampled (bright dots: mostly cosmic-ray hits).

Dither Handbook, Example #2: edge-on galaxy nucleus NGC 4565.

SLIDE 23

Uncertainties and error propagation

Error propagation from the source to the end result:
  • Computing uncertainties
  • Simplifying uncertainties
  • Storing uncertainties
  • Using uncertainties

SLIDE 24

Error modeling: from source to result

Diagram: the pdf of an input pixel is transformed by the image processing into the pdf of an output pixel.

Input noise: stochastic process (observation = realization of a random variable)
  • Several additive processes, zero-mean
  • Stochastic independence between pixels (white noise)
  • Stationary process

Processing algorithm: deterministic transform

Output noise: stochastic process (result = realization of a random variable)
  • Additive & zero-mean assumption, stochastic independence, stationarity

SLIDE 25

Uncertain knowledge of the pixels

  • Diagonal terms: inverse variance, if all other terms are zero (1/pixel)
  • Near-diagonal terms: nearest neighbors, left/right and up/down interactions (2/pixel); longer range, diagonal interactions (2/pixel)
  • Long-range terms (n/pixel): have to be zero (more convenient than realistic)

Posterior pdf: P(X | Y) ∝ exp(−U(X)); Gaussian approximation of the posterior pdf. Uncertainties = 2nd derivatives of U at the optimum: the “inverse covariance matrix”.

SLIDE 26

Computing and propagating uncertainties

Inverse covariance matrix computation:
  • Second derivatives of the energy U(X) at the optimum
  • Sparse matrix (the interaction range depends on the size of the blur kernel h)

Σ⁻¹_X = S⁻¹ [∇²U] S⁻¹

Recursive processing and uncertainty propagation:
  • Use the simplified posterior (mean, approximate inverse covariance) as a prior density for subsequent data processing
  • Recursive (vs. batch) data fusion: allows for model updates

Φ^{(k+1)}(L) = Lᵀ S [Σ̃⁻¹_X]^{(k)} S L
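The recursive-fusion idea, reusing the posterior (mean, inverse covariance) as the next prior, can be checked on a scalar toy case, where recursive and batch fusion must agree:

```python
# Sketch of recursive uncertainty propagation: the Gaussian posterior
# (mean, inverse covariance) from one fusion step becomes the prior of
# the next step. For linear-Gaussian models, recursive and batch fusion
# give the same answer; scalar toy check below with made-up numbers.

def fuse_step(m_prior, ic_prior, y, ic_noise):
    """One Gaussian fusion step (ic = inverse covariance = precision):
    precisions add, means are precision-weighted."""
    ic_post = ic_prior + ic_noise
    m_post = (ic_prior * m_prior + ic_noise * y) / ic_post
    return m_post, ic_post

obs = [(2.0, 4.0), (1.0, 1.0), (3.0, 0.5)]  # (value, precision) pairs
m, ic = 0.0, 1e-9                           # near-flat initial prior

# Recursive: feed observations one at a time.
for y, icn in obs:
    m, ic = fuse_step(m, ic, y, icn)

# Batch: fuse everything at once.
ic_batch = 1e-9 + sum(icn for _, icn in obs)
m_batch = sum(icn * y for y, icn in obs) / ic_batch

print(m, m_batch)  # identical up to rounding
```

The same identity is what licenses model updates without reprocessing the whole archive.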

SLIDE 27

Approximating uncertainties

Inverse covariance approximation. Goal: provide a 1st-order Markovian model.
  • The inverse covariance matrix is sparse, but not enough...
  • Drop the long-range interactions for simplicity
  • Preserve the variance and nearest-neighbor covariance information
  • Minimize a distance between the true Gaussian (covariance Σ_X) and the approximation (covariance Σ̃_X), e.g.:

inf D_KL( G(0, Σ_X), G(0, Σ̃_X) )  over the approximation parameters

SLIDE 28

Storing uncertainties

Optimum: N×N pixels. Uncertainties: N×N × (1 + 2 [+ 2]) parameters, one per type of interaction: self, vertical, horizontal [and the two diagonals]. Limited redundancy: 3 or 5 parameters per pixel.

SLIDE 29

Processing images with uncertainties

  • Bayesian approach to image analysis. Goal: compute the posterior pdf of the desired parameters
  • Use the extended data term: posterior pdf from the fusion
  • Update existing statistical methods (Bayesian or not) to use this extended term: no other changes required!
  • If possible, provide uncertainties on the analysis result as well
  • Example: prediction X = F(parameters), F = star profile...

Extended data term (the usual term plus extra off-diagonal terms, the horizontal and vertical covariances; r(p) and u(p) denote the right and upper neighbors of pixel p):

D(X) = Σ_p [ ½ d_p (X − X̂)²_p + c^h_p (X − X̂)_p (X − X̂)_{r(p)} + c^v_p (X − X̂)_p (X − X̂)_{u(p)} ]
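The extended data term can be sketched directly; the toy check below (made-up coefficients on a 2x3 image) verifies that it equals the quadratic form ½ eᵀQe for the corresponding sparse symmetric matrix Q:

```python
import numpy as np

# Sketch of the extended data term: beyond the usual per-pixel quadratic
# term, it adds horizontal and vertical covariance terms between each
# pixel and its right/upper neighbor. Toy 2x3 image, made-up coefficients.

H, W = 2, 3
rng = np.random.default_rng(2)
e = rng.normal(size=(H, W))      # residual X - X_hat
d = np.full((H, W), 2.0)         # diagonal (self) terms
ch = np.full((H, W), 0.3)        # horizontal covariance terms
cv = np.full((H, W), 0.2)        # vertical covariance terms

def extended_data_term(e, d, ch, cv):
    total = 0.5 * np.sum(d * e**2)                      # usual term
    total += np.sum(ch[:, :-1] * e[:, :-1] * e[:, 1:])  # right neighbor
    total += np.sum(cv[:-1, :] * e[:-1, :] * e[1:, :])  # upper neighbor
    return total

# Cross-check against the dense quadratic form (1/2) e^T Q e.
n = H * W
Q = np.zeros((n, n))
def idx(i, j):
    return i * W + j
for i in range(H):
    for j in range(W):
        Q[idx(i, j), idx(i, j)] = d[i, j]
        if j + 1 < W:
            Q[idx(i, j), idx(i, j + 1)] += ch[i, j]
            Q[idx(i, j + 1), idx(i, j)] += ch[i, j]
        if i + 1 < H:
            Q[idx(i, j), idx(i + 1, j)] += cv[i, j]
            Q[idx(i + 1, j), idx(i, j)] += cv[i, j]

ev = e.ravel()
print(extended_data_term(e, d, ch, cv), 0.5 * ev @ Q @ ev)  # equal
```

In other words, swapping the usual diagonal data term for this one only changes Q, not the estimation machinery built on top of it.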

SLIDE 30

Extension to multispectral data

Fusion of single band images taken with different filters?
  • The “single grayscale object” assumption is no longer valid
  • The object must be multispectral... and so must be the image formation model

Fusion of arbitrary multispectral images?
  • Generalization: collection of registered single-band images
  • Frequency response overlap: source of redundancy
  • Use the filter transmission function for each band
  • Find an optimal representation / optimal spectral sampling
SLIDE 31

Multispectral uncertainties

Add interactions between bands.

Optimum: M bands of N×N pixels. Uncertainties: N×N × (M + 2M [+ 2M] + M−1) parameters. Limited redundancy: max. 4 or 6 parameters per pixel and band.

SLIDE 32

Extension to integral field spectroscopy

Model the entire data acquisition process!
  • Lens arrays or fibers (optical degradation and sampling, spaxels)
  • Spectrum formation and sampling on the detector (pixels)

Figure: OASIS optical layout (CFHT); detector grid, spectrum, continuum image.

SLIDE 33

Conclusions

Accomplishments
  • Bayesian approach to data fusion in 2D, single band
  • Validation in 1D/2D (band-limited signals & images)
  • Super-resolution from multiple undersampled observations
  • Uncertainty computation (inverse covariance matrix)

To do...
  • Automatic calibration (registration and prior)
  • Better priors for astronomical images (sparse Bayes)
  • Multispectral data fusion
  • Validation on real data (HST WFPC2, Virtual Observatory)
  • Format specification (single and multi-band)
  • Post-processing: simple image analysis library...