
SLIDE 1

Image resampling and constraint formulation for multi-frame super-resolution restoration

  • S. Borman & R. L. Stevenson

Laboratory for Image and Signal Analysis Department of Electrical Engineering University of Notre Dame Indiana, USA

SLIDE 2

Introduction

Consider the problem of multi-frame super-resolution image restoration:

  • Estimate super-resolved images from multiple images of the scene
  • Relative scene/camera motion provides essential constraints for restoration

Objectives:

  • Generalize observation model
  • Ideally no changes to restoration framework should be necessary
  • Easy to incorporate spatially varying degradations
  • Accommodate arbitrary motion fields
SLIDE 3

Overview

  • Multi-frame super-resolution introduction
  • Image resampling theory
  • Multi-frame super-resolution observation model
  • Show relationship between the two!
  • Example observation model including lens PSF and pixel integration
  • Example projected images and constraints
  • Extensions
SLIDE 4

Multi-Frame Super-Resolution Restoration

  • Given: noisy, under-sampled low-resolution image sequence
  • Estimate: Super-resolved images (bandwidth extrapolation)
  • Use information from multiple observed images in estimate
  • Sub-pixel registration of multiple images* provides restoration constraints
  • Model observation process (lens / sensor / noise)
  • Include a priori knowledge

*Think IMAGE WARPING or RESAMPLING

SLIDE 5

Image Resampling

  • Objective: Sampling of discrete image under coordinate transformation
  • Discrete input image (texture): f(u) with u = [u v]T ∈ Z2
  • Discrete output image (warped): g(x) with x = [x y]T ∈ Z2
  • Forward mapping: H : u → x
  • Simplistic approach: ∀ x ∈ Z2, g(x) = f(H−1(x))
  • Problems:
    1. H−1(x) need not fall on sample points (interpolation required)
    2. H−1(x) may undersample f(u), resulting in aliasing
       (this occurs when the mapping results in minification)
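The naive scheme is easy to state in code. A minimal NumPy sketch (the 2× minification example and the nearest-neighbour rounding are illustrative choices, not from the slides) that exhibits the aliasing problem directly:

```python
import numpy as np

def naive_resample(f, H_inv, out_shape):
    """Naive resampling: evaluate g(x) = f(H_inv(x)) with nearest-neighbour
    rounding. H_inv maps output coordinates (x, y) back to input (u, v).
    No interpolation or pre-filtering is done, so both problems above occur."""
    g = np.zeros(out_shape, dtype=f.dtype)
    for y in range(out_shape[0]):
        for x in range(out_shape[1]):
            u, v = H_inv(x, y)
            ui, vi = int(round(u)), int(round(v))  # snap to nearest sample
            if 0 <= vi < f.shape[0] and 0 <= ui < f.shape[1]:
                g[y, x] = f[vi, ui]
    return g

# 2x minification: H maps u -> u/2, so H_inv maps x -> 2x.
# Alternating-column texture: half the columns are simply skipped (aliasing),
# and the output retains no trace of the "1" columns.
f = np.tile([0.0, 1.0], (4, 4))                      # 4x8, columns 0,1,0,1,...
g = naive_resample(f, lambda x, y: (2 * x, 2 * y), (2, 4))
```

Here the minified output is constant zero even though the texture has mean 0.5, which is exactly the undersampling failure the prefilter in the pipeline below is designed to prevent.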

SLIDE 6

Theoretical Image Resampling Pipeline (Heckbert)

  • 1. Continuous reconstruction (interpolation) of input image (texture):

    ˜f(u) = f(u) ⊛ r(u) = Σ_{k∈Z2} f(k) · r(u − k)

  • 2. Warp the continuous reconstruction:

    ˜g(x) = ˜f(H−1(x))

  • 3. Pre-filter the warped image to prevent aliasing in the sampling step:

    ˜g′(x) = ˜g(x) ⊛ p(x) = ∫ ˜g(α) · p(x − α) dα

  • 4. Sample to produce the discrete output image:

    g(x) = ˜g′(x) for x ∈ Z2

SLIDE 7

Heckbert’s Resampling Pipeline

f(u) → [reconstruction filter r(u)] → ˜f(u) → [geometric transform H] → ˜g(x) → [prefilter p(x)] → ˜g′(x) → [sample ∆] → g(x)

(Discrete texture in, discrete resampled image out.)

SLIDE 8

Realized Image Resampling Pipeline (Heckbert)

  • Never reconstruct continuous images:

    g(x) = ˜g′(x) for x ∈ Z2
         = ∫ ˜f(H−1(α)) · p(x − α) dα
         = ∫ p(x − α) Σ_{k∈Z2} f(k) · r(H−1(α) − k) dα
         = Σ_{k∈Z2} f(k) ρ(x, k)

    where

    ρ(x, k) = ∫ p(x − α) · r(H−1(α) − k) dα

    is a spatially varying resampling filter.

SLIDE 9

Realized Image Resampling Pipeline (Heckbert)

  • The resampling filter

    ρ(x, k) = ∫ p(x − α) · r(H−1(α) − k) dα

    is described in terms of the warped reconstruction filter r and integration in x-space.

  • With the change of variables α = H(u), integrating in u-space, the resampling filter can be expressed in terms of the warped pre-filter p:

    ρ(x, k) = ∫ p(x − H(u)) · r(u − k) |∂H/∂u| du

    where |∂H/∂u| is the determinant of the Jacobian.
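The u-space form makes the filter straightforward to evaluate numerically. A minimal 1-D quadrature sketch (the box pre-filter, triangle reconstruction filter, identity warp, and grid spacing are all illustrative choices, not from the slides):

```python
import numpy as np

def p(t):
    """Anti-alias pre-filter: unit box on (-0.5, 0.5)."""
    return np.where(np.abs(t) < 0.5, 1.0, 0.0)

def r(t):
    """Reconstruction filter: linear-interpolation triangle kernel."""
    return np.maximum(0.0, 1.0 - np.abs(t))

def rho(x, k, H, dH_det, u, du):
    """Quadrature approximation (shown in 1-D for clarity) of
       rho(x, k) = ∫ p(x - H(u)) * r(u - k) * |dH/du| du."""
    return float(np.sum(p(x - H(u)) * r(u - k) * np.abs(dH_det(u))) * du)

du = 1e-3
u = np.arange(-8.0, 8.0, du)

# Identity warp: rho(x, k) reduces to the convolution (p * r)(x - k).
# Since the triangle kernel partitions unity, the weights attached to one
# output sample x sum to 1 over k.
weights = [rho(0.0, k, lambda t: t, lambda t: np.ones_like(t), u, du)
           for k in range(-3, 4)]
```

For this choice of filters the centre weight works out to 0.75 with 0.125 on each neighbour, and the weights sum to 1, which is a useful sanity check for any numerical implementation of ρ.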
SLIDE 10

Super-Resolution Observation Model

Generalize super-resolution observation model of Schultz & Stevenson:

  • Optics

– Diffraction-limited PSF, defocus, aberrations, etc.

  • Sensor

– Spatial response, temporal integration

  • Must be able to accommodate spatially varying degradations
  • Need technique for general motion maps!
SLIDE 11

Multi-Frame Observation Model

  • N observations g(i)(x), i ∈ {1, 2, . . . , N} of underlying scene f(u)
  • Related via geometric transformations H(i) (scene/camera motion)
  • Spatially varying PSFs h(i) (may vary across observations)
  • LSV PSFs can include lens and sensor responses, defocus, motion blur etc.

    g(i)(x)|x∈Z2 = ∫ h(i)(x, α) · f(H(i)−1(α)) dα
SLIDE 12

Multi-Frame Observation Model

f(u) → [geometric transform H(i)] → [lens/sensor PSF h(i)] → [sample ∆(i)] → g(i)(x),  i = 1, …, N

(Continuous scene in, discrete observed images out.)

SLIDE 13

Super-Resolution Restoration Observation Model

  • Relate observations to image to be restored (estimate of scene)
  • Discretized approximation f(k) of scene f(u) using interpolation kernel hr:

    f(u) ≈ Σ_k f(k) · hr(u − k)

  • Combining with the earlier result,

    g(i)(x)|x∈Z2 = ∫ h(i)(x, α) · f(H(i)−1(α)) dα
                 = ∫ h(i)(x, α) Σ_k f(k) · hr(H(i)−1(α) − k) dα
  • Identical in form to resampling expressions
  • Can thus find spatially variant resampling filter relating g(i)(x) to f(k)
SLIDE 14

Comparison of Resampling and Restoration Models

    Resampling                        Restoration
    ----------                        -----------
    Discrete texture f(u)             Discrete scene estimate f(u)
    Reconstruction filter r(u)        Interpolation kernel hr(u)
    Geometric transform H(u)          Scene/camera motion H(i)(u)
    Anti-alias prefilter p(x)         Observation SVPSF h(i)(x, α)
    Warped output image g(x)          Observed images g(i)(x)

  • Resampling filter:

    ρ(x, k) = ∫ p(x − H(u)) · r(u − k) |∂H/∂u| du

  • Observation filter:

    ρ(i)(x, k) = ∫ h(i)(x, H(i)(u)) · hr(u − k) |∂H(i)/∂u| du
  • But how do we find the observation filter in practice?
SLIDE 15

Determining the Observation Filter

  • Measure or model combined PSFs

– Lens, sensor spatial integration, temporal integration, defocus, etc.

  • Estimate or model inter-frame registration (geometric transforms)

– Motion estimation from observed scenes, observation geometry, etc.

  • Present an example:

– PSF accounts for diffraction-limited optical system and sensor spatial integration
– Geometric transforms based on controlled imaging geometry
– Demonstrate method for finding the observation filter

SLIDE 16

Optical System Modeling

Assumptions:

  • Diffraction limited
  • Incoherent illumination
  • Circular exit pupil

⇒ Radially symmetric point spread function

    h(r′) = [2J1(r′) / r′]2,  with r′ = (π/λN) r

J1(·) Bessel function of the first kind, wavelength λ, f-number N, radial distance r.
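This PSF is straightforward to evaluate numerically; a sketch using SciPy's Bessel function (assuming SciPy is available; the defaults λ = 550 nm, N = 2.8 are the example values used on the next slide):

```python
import numpy as np
from scipy.special import j1  # Bessel function of the first kind, order 1

def airy_psf(r, wavelength=550e-9, fnumber=2.8):
    """Radially symmetric diffraction-limited PSF h(r') = [2 J1(r')/r']^2
    with r' = (pi / (lambda N)) r, normalized so that h(0) = 1."""
    rp = np.pi / (wavelength * fnumber) * np.atleast_1d(np.asarray(r, float))
    out = np.ones_like(rp)          # limiting value 1 as r' -> 0
    nz = rp != 0
    out[nz] = (2.0 * j1(rp[nz]) / rp[nz]) ** 2
    return out

# First zero of the Airy disk at r' = 3.8317, i.e. r = 1.22 * lambda * N
r_first_zero = 1.22 * 550e-9 * 2.8   # ~1.88 um, matching the next slide
```

The normalization h(0) = 1 follows from the small-argument limit 2J1(r′)/r′ → 1.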

SLIDE 17

Example – Optical System PSF

  • λ=550nm (green), N = 2.8
  • First zero of Airy Disk at 1.22λN = 1.88µm

[Figure: lens PSF shown as a surface plot and a contour plot, x and y axes in microns]

SLIDE 18

Optical Transfer Function

    H(ρ′) = (2/π) [cos−1(ρ′) − ρ′ √(1 − ρ′2)],  for ρ′ ≤ 1
    H(ρ′) = 0,  otherwise

where

    ρ′ = ρ/ρc    normalized radial spatial frequency
    ρ            radial spatial frequency
    ρc = 1/λN    radial spatial frequency cut-off
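The formula can be checked numerically against the cut-off quoted on the next slide; a sketch (parameter defaults taken from the example slides):

```python
import numpy as np

def otf(rho, wavelength=550e-9, fnumber=2.8):
    """Diffraction-limited incoherent OTF:
       H(rho') = (2/pi) [acos(rho') - rho' sqrt(1 - rho'^2)] for rho' <= 1,
       0 otherwise, with rho' = rho / rho_c and cut-off rho_c = 1/(lambda N)."""
    rho_c = 1.0 / (wavelength * fnumber)
    rp = np.atleast_1d(np.asarray(rho, float)) / rho_c
    out = np.zeros_like(rp)                     # zero beyond the cut-off
    m = (rp >= 0.0) & (rp <= 1.0)
    out[m] = (2.0 / np.pi) * (np.arccos(rp[m]) - rp[m] * np.sqrt(1.0 - rp[m] ** 2))
    return out

rho_c = 1.0 / (550e-9 * 2.8)   # ~649.35 lines/mm, the cut-off on the next slide
```

H(0) = 1, the response falls monotonically to zero at ρc, and it is identically zero beyond, which is what makes the Nyquist bound on the next slide well defined.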

SLIDE 19

Example – Optical Transfer Function

  • Radial frequency cut-off ρc = 1/λN = 649.35 lines/mm
  • Nyquist sampling requires > 2 × ρc ≈ 1300 samples/mm
  • Sample spacing < 0.77µm! ... (but pixel spatial integration is crude LPF)

[Figure: optical transfer function surface, u and v axes in cycles/mm, and its radial profile]

SLIDE 20

Combined Transfer Function

  • LPF due to spatial integration over light-sensitive area of pixel
  • Assume ideal pixel response (100% fill-factor, flat response, dimension T )
  • Sinc from pixel integration has first zeros at sampling frequency (fs = 1/T )
  • Need zero response for f > fs/2 for correct anti-aliasing

[Figure: pixel modulation transfer function and combined modulation transfer function, u and v axes in cycles/mm]
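The pixel-aperture MTF described above is a sinc; a sketch (the pixel dimension T here is an illustrative value, not from the slides):

```python
import numpy as np

def pixel_mtf(f, T):
    """MTF of an ideal pixel (100% fill factor, flat response, dimension T):
    |sinc(f T)|, using np.sinc(x) = sin(pi x)/(pi x).
    First zeros fall at the sampling frequency fs = 1/T."""
    return np.abs(np.sinc(np.atleast_1d(np.asarray(f, float)) * T))

T = 10e-6          # assumed pixel dimension (illustrative value)
fs = 1.0 / T       # sampling frequency

# At the Nyquist frequency fs/2 the response is still 2/pi ~ 0.64, so the
# pixel aperture is only a crude low-pass filter: frequencies between fs/2
# and fs leak through partially attenuated and alias.
```

The assertion that the first zero sits at fs rather than fs/2 is precisely why ideal anti-aliasing (zero response above fs/2) cannot be achieved by pixel integration alone.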

SLIDE 21

Combined Pixel Response Function

  • hcombined = hlens ∗ hpixel
  • Significant “leakage” even with ideal lens/pixel responses
  • Reality is much worse (optics and sensors)

[Figure: combined pixel and lens response as a surface plot (x, y in microns) and a log-scale cross-section (distance in pixels)]

SLIDE 22

Determining the Warped Pixel Response

  • Backproject PSF h(x, α) from g(x) to restored image using H−1 (red)
  • Determine bounding region for image of h(x, α) under backprojection (cyan)
  • ∀ S-R pixels u in region, project via H and find h(x, H(u)) (green)
  • Scale according to Jacobian and interpolation kernel hr then integrate over u

[Diagram: observed image g(x) and restored image f(u), related by the forward mapping H and its inverse H−1]

SLIDE 23

Example – Warped Pixel Response

  • Projective transformation H
  • PSF associated with observed pixel at location (x, y) = (0, 0) (left)
  • Regularly sampled warped PSF on restoration grid (u, v) (right)
  • 10 samples/restoration pixel (right). Projection of samples shown (left)

[Figure: PSF of the observed pixel at (0, 0) with projected sample locations (left); warped PSF regularly resampled on the restoration grid (u, v) (right)]

SLIDE 24

Example – Projective Transformation

  • Projective transformation H for geometric mapping

    x = φ(u, v) = (h11 u + h12 v + h13) / (h31 u + h32 v + 1)
    y = ψ(u, v) = (h21 u + h22 v + h23) / (h31 u + h32 v + 1)

  • Linear when using homogeneous coordinates:

    [xw]   [h11 h12 h13] [u]
    [yw] = [h21 h22 h23] [v]
    [ w]   [h31 h32  1 ] [1]

  • Jacobian:

    [dx]   [∂φ/∂u  ∂φ/∂v] [du]
    [dy] = [∂ψ/∂u  ∂ψ/∂v] [dv]
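The homogeneous-coordinate mapping and the Jacobian determinant (the local area scaling needed when warping the filters) can be sketched as follows (helper names are mine, not from the slides):

```python
import numpy as np

def apply_homography(H, u, v):
    """Map (u, v) through a 3x3 homography using homogeneous coordinates."""
    xw, yw, w = H @ np.array([u, v, 1.0])
    return xw / w, yw / w

def jacobian_det(H, u, v):
    """Determinant of the Jacobian d(x,y)/d(u,v) of the projective map,
    obtained by applying the quotient rule to phi and psi."""
    h11, h12, h13 = H[0]
    h21, h22, h23 = H[1]
    h31, h32, _ = H[2]
    d = h31 * u + h32 * v + 1.0
    dxdu = (h11 * d - (h11 * u + h12 * v + h13) * h31) / d ** 2
    dxdv = (h12 * d - (h11 * u + h12 * v + h13) * h32) / d ** 2
    dydu = (h21 * d - (h21 * u + h22 * v + h23) * h31) / d ** 2
    dydv = (h22 * d - (h21 * u + h22 * v + h23) * h32) / d ** 2
    return dxdu * dydv - dxdv * dydu

# Sanity check with a pure 2x scaling: points double, areas scale by 4.
H = np.array([[2.0, 0.0, 0.0],
              [0.0, 2.0, 0.0],
              [0.0, 0.0, 1.0]])
```

Fixing h33 = 1 matches the slide's convention, since a homography is only defined up to scale.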
SLIDE 25

Discrete Observation Model

  • Pixels in each observed image are related to the restored image

    g(i)(x)|x∈Z2 = Σ_{k∈Z2} f(k) ρ(i)(x, k)

  • In the S-R restoration this constitutes one row of the observation matrix A:

    [g(1)T g(2)T · · · g(N)T]T = A f

    where the images are lexicographically ordered column vectors

  • These equations are solved to determine the S-R estimate f
  • Typically regularized solution methods are used (ill-posed inverse problem)
  • No changes to restoration algorithm are required!
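Stacking the filters into A can be sketched as follows, using a dense matrix and a toy dict representation of ρ(i)(x, k) for clarity (real systems use sparse storage; all names and the two-camera example are illustrative, not from the slides):

```python
import numpy as np

def build_A(rhos, n_pixels_f):
    """Stack per-observation filters rho_i(x, k) into the observation matrix A.
    rhos[i] maps output pixel index x -> {k: weight}; since images are
    lexicographically ordered vectors, each (i, x) pair is one row of A."""
    blocks = []
    for rho_i in rhos:
        block = np.zeros((len(rho_i), n_pixels_f))
        for x, weights in rho_i.items():
            for k, w in weights.items():
                block[x, k] = w
        blocks.append(block)
    return np.vstack(blocks)

# Toy example: two 2-pixel observations of a 4-pixel scene estimate f.
rhos = [
    {0: {0: 0.5, 1: 0.5}, 1: {2: 0.5, 3: 0.5}},   # averaging "camera" 1
    {0: {1: 1.0},          1: {3: 1.0}},           # shifted sampling "camera" 2
]
A = build_A(rhos, 4)
f = np.array([1.0, 2.0, 3.0, 4.0])
g = A @ f                      # stacked observations [g(1); g(2)]
```

With A in hand, projection (A f) and backprojection (AT g) are plain matrix products, which is why the restoration algorithm itself needs no changes.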
SLIDE 26

Example Demonstrating Observation Filter

  • Create two simulated views with custom raytracer (with lens/sensor model)
SLIDE 27

Example ctnd...

  • Assume 2D “scene” in X-Y plane, lens/sensor PSF discussed earlier
  • Camera matrices known, so compute homography induced by the plane
  • Projective transformation H relates the image pair g(1) and g(2)

[Images: simulated views g(1) and g(2)]

SLIDE 28

Example ctnd...

  • Compute ρ(x, k) relating the first (plan) view g(1) to the restoration coincident with the second (oblique) view g(2), assuming 4:1 image expansion for restoration
  • Compute backprojection AT g(1) (should be oblique, expanded)

[Images: g(1) and its backprojection AT g(1)]

SLIDE 29

Example ctnd...

  • Given estimate ˆf of f, compute projection A ˆf (should look like g(1))

[Images: ˆf and its projection A ˆf]

SLIDE 30

Features

  • Applicable to general, spatially varying PSFs
  • Spatially varying responses present no additional difficulties
  • Measured PSF response data are easily incorporated
  • Once ρ(i)(x, k) are determined, restoration proceeds as usual
  • A-matrix projection/backprojection is completely characterized by ρ(i)(x, k)
  • No change whatsoever to super-resolution restoration framework
  • More realistic observation models ⇒ better restoration
SLIDE 31

Extensions and Challenges

  • Accelerate computing of ρ(i)(x, k)

– Computing the projected PSFs is costly
– Immediate saving: tighter bounding box on backprojection of PSF
– Use scanline algorithms from CG literature

  • Accuracy of observation model vs space complexity of storing ρ(i)(x, k)
  • General image motion

– Displacement vector field or locally modeled regions
– Either way we have a local coordinate transformation (no problem!)
– Occlusions?

SLIDE 32

Summary

  • Close relationship between the image resampling pipeline and the multi-frame image restoration problem
  • Used ideas from resampling to develop a method for incorporating very general observation models in the restoration framework
  • Illustrated application to restoration with non-ideal lens/sensor response
SLIDE 33

The end...

“If it wasn’t for the last minute, nothing would get done...”