

  1. Image resampling and constraint formulation for multi-frame super-resolution restoration S. Borman & R. L. Stevenson Laboratory for Image and Signal Analysis Department of Electrical Engineering University of Notre Dame Indiana, USA

  2. Introduction
Consider the problem of multi-frame super-resolution image restoration:
• Estimate super-resolved images from multiple images of the scene
• Relative scene/camera motion provides essential constraints for restoration
Objectives:
• Generalize the observation model
• Ideally, no changes to the restoration framework should be necessary
• Easy to incorporate spatially varying degradations
• Accommodate arbitrary motion fields

  3. Overview
• Multi-frame super-resolution introduction
• Image resampling theory
• Multi-frame super-resolution observation model
• Show the relationship between the two!
• Example observation model including lens PSF and pixel integration
• Example projected images and constraints
• Extensions

  4. Multi-Frame Super-Resolution Restoration
• Given: noisy, under-sampled low-resolution image sequence
• Estimate: super-resolved images (bandwidth extrapolation)
• Use information from multiple observed images in the estimate
• Sub-pixel registration of the multiple images provides restoration constraints
• Model the observation process (lens / sensor / noise)
• Include a-priori knowledge
⇒ Think IMAGE WARPING or RESAMPLING

  5. Image Resampling
• Objective: sampling of a discrete image under a coordinate transformation
• Discrete input image (texture): $f(\mathbf{u})$ with $\mathbf{u} = [u\;v]^T \in \mathbb{Z}^2$
• Discrete output image (warped): $g(\mathbf{x})$ with $\mathbf{x} = [x\;y]^T \in \mathbb{Z}^2$
• Forward mapping: $H : \mathbf{u} \mapsto \mathbf{x}$
• Simplistic approach: $\forall\, \mathbf{x} \in \mathbb{Z}^2,\; g(\mathbf{x}) = f(H^{-1}(\mathbf{x}))$
• Problems:
  1. $H^{-1}(\mathbf{x})$ need not fall on sample points (interpolation required)
  2. $H^{-1}(\mathbf{x})$ may undersample $f(\mathbf{u})$, resulting in aliasing (this occurs when the mapping results in minification)
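The simplistic approach $g(\mathbf{x}) = f(H^{-1}(\mathbf{x}))$ can be sketched directly; `bilinear` and the example inverse mappings are illustrative helpers, not from the slides. Interpolation addresses problem 1; nothing here guards against problem 2 (aliasing under minification).

```python
import numpy as np

def bilinear(f, u, v):
    """Bilinearly interpolate discrete image f at continuous coords (u, v)."""
    u0 = min(max(int(np.floor(u)), 0), f.shape[1] - 2)
    v0 = min(max(int(np.floor(v)), 0), f.shape[0] - 2)
    du, dv = u - u0, v - v0
    return ((1 - dv) * ((1 - du) * f[v0, u0] + du * f[v0, u0 + 1])
            + dv * ((1 - du) * f[v0 + 1, u0] + du * f[v0 + 1, u0 + 1]))

def resample_inverse(f, H_inv, out_shape):
    """Simplistic resampling: for every output pixel x, evaluate the input
    at H^{-1}(x), interpolating off-grid sample positions."""
    g = np.zeros(out_shape)
    for y in range(out_shape[0]):
        for x in range(out_shape[1]):
            u, v = H_inv(x, y)
            g[y, x] = bilinear(f, u, v)
    return g
```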

  6. Theoretical Image Resampling Pipeline (Heckbert)
1. Continuous reconstruction (interpolation) of the input image (texture):
   $\tilde f(\mathbf{u}) = f(\mathbf{u}) \circledast r(\mathbf{u}) = \sum_{\mathbf{k} \in \mathbb{Z}^2} f(\mathbf{k})\, r(\mathbf{u} - \mathbf{k})$
2. Warp the continuous reconstruction:
   $\tilde g(\mathbf{x}) = \tilde f\!\left(H^{-1}(\mathbf{x})\right)$
3. Pre-filter the warped image to prevent aliasing in the sampling step:
   $\tilde g'(\mathbf{x}) = \tilde g(\mathbf{x}) \circledast p(\mathbf{x}) = \iint \tilde g(\boldsymbol\alpha)\, p(\mathbf{x} - \boldsymbol\alpha)\, d\boldsymbol\alpha$
4. Sample to produce the discrete output image:
   $g(\mathbf{x}) = \tilde g'(\mathbf{x})$ for $\mathbf{x} \in \mathbb{Z}^2$

  7. Heckbert’s Resampling Pipeline
[Diagram] $f(\mathbf{u})$ (discrete texture) → reconstruction filter $r(\mathbf{u})$ → $\tilde f(\mathbf{u})$ → geometric transform $H(\mathbf{u})$ → $\tilde g(\mathbf{x})$ → prefilter $p(\mathbf{x})$ → $\tilde g'(\mathbf{x})$ → sample $\Delta$ → $g(\mathbf{x})$ (discrete resampled image)

  8. Realized Image Resampling Pipeline (Heckbert)
• Never reconstruct continuous images: for $\mathbf{x} \in \mathbb{Z}^2$,
  $g(\mathbf{x}) = \tilde g'(\mathbf{x})
  = \iint \tilde f\!\left(H^{-1}(\boldsymbol\alpha)\right) p(\mathbf{x} - \boldsymbol\alpha)\, d\boldsymbol\alpha
  = \iint \sum_{\mathbf{k} \in \mathbb{Z}^2} f(\mathbf{k})\, r\!\left(H^{-1}(\boldsymbol\alpha) - \mathbf{k}\right) p(\mathbf{x} - \boldsymbol\alpha)\, d\boldsymbol\alpha
  = \sum_{\mathbf{k} \in \mathbb{Z}^2} f(\mathbf{k})\, \rho(\mathbf{x}, \mathbf{k})$
  where
  $\rho(\mathbf{x}, \mathbf{k}) = \iint p(\mathbf{x} - \boldsymbol\alpha)\, r\!\left(H^{-1}(\boldsymbol\alpha) - \mathbf{k}\right) d\boldsymbol\alpha$
  is a spatially varying resampling filter.
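A 1-D numerical sketch of these expressions: the tent reconstruction filter and box prefilter are assumed choices (the slides do not fix $r$ and $p$), and the integral is approximated by a Riemann sum.

```python
import numpy as np

def tri(t):   # tent reconstruction filter r (assumed choice)
    return np.maximum(0.0, 1.0 - np.abs(t))

def box(t):   # unit box prefilter p (assumed choice)
    return np.where(np.abs(t) <= 0.5, 1.0, 0.0)

def rho(x, k, H_inv, a):
    """rho(x, k) = ∫ p(x - a) r(H^{-1}(a) - k) da, by Riemann sum on grid a."""
    return np.sum(box(x - a) * tri(H_inv(a) - k)) * (a[1] - a[0])

def resample(f, H_inv, out_len, a):
    """g(x) = sum_k f(k) rho(x, k): the whole pipeline as one filter."""
    return np.array([sum(f[k] * rho(x, k, H_inv, a)
                         for k in range(len(f))) for x in range(out_len)])
```

For the identity warp and a constant input, interior outputs come out equal to the input level, since the tent filters form a partition of unity and the box integrates to one.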

  9. Realized Image Resampling Pipeline (Heckbert)
• The resampling filter
  $\rho(\mathbf{x}, \mathbf{k}) = \iint p(\mathbf{x} - \boldsymbol\alpha)\, r\!\left(H^{-1}(\boldsymbol\alpha) - \mathbf{k}\right) d\boldsymbol\alpha$
  is described in terms of the warped reconstruction filter $r$ and integration in $\mathbf{x}$-space.
• With the change of variables $\boldsymbol\alpha = H(\mathbf{u})$ and integrating in $\mathbf{u}$-space, the resampling filter can be expressed in terms of the warped pre-filter $p$:
  $\rho(\mathbf{x}, \mathbf{k}) = \iint p(\mathbf{x} - H(\mathbf{u}))\, r(\mathbf{u} - \mathbf{k}) \left| \frac{\partial H}{\partial \mathbf{u}} \right| d\mathbf{u}$
• $\left| \partial H / \partial \mathbf{u} \right|$ is the determinant of the Jacobian
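The equivalence of the two forms can be checked numerically in 1-D. The affine warp $H(u) = 2u + 1$ and the tent/box filters are assumed for illustration only; for this case the Jacobian factor is the constant $|dH/du| = 2$.

```python
import numpy as np

s, t = 2.0, 1.0                              # affine warp H(u) = s*u + t
H     = lambda u: s * u + t
H_inv = lambda a: (a - t) / s
jac   = s                                    # |dH/du|

tri = lambda z: np.maximum(0.0, 1.0 - np.abs(z))      # reconstruction filter r
box = lambda z: np.where(np.abs(z) <= 0.5, 1.0, 0.0)  # prefilter p

grid = np.linspace(-10.0, 10.0, 200001)
d = grid[1] - grid[0]
x, k = 3, 1

# alpha-space form: ∫ p(x - a) r(H^{-1}(a) - k) da
rho_alpha = np.sum(box(x - grid) * tri(H_inv(grid) - k)) * d
# u-space form after a = H(u): ∫ p(x - H(u)) r(u - k) |dH/du| du
rho_u = np.sum(box(x - H(grid)) * tri(grid - k) * jac) * d
```

Both quadratures agree (analytically the value here is 0.875), confirming the change of variables.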

  10. Super-Resolution Observation Model
Generalize the super-resolution observation model of Schultz & Stevenson:
• Optics – diffraction-limited PSF, defocus, aberrations, etc.
• Sensor – spatial response, temporal integration
• Must be able to accommodate spatially varying degradations
• Need a technique for general motion maps!

  11. Multi-Frame Observation Model
• $N$ observations $g^{(i)}(\mathbf{x})$, $i \in \{1, 2, \ldots, N\}$, of the underlying scene $f(\mathbf{u})$
• Related via geometric transformations $H^{(i)}$ (scene/camera motion)
• Spatially varying PSFs $h^{(i)}$ (may vary across observations)
• LSV PSFs can include lens and sensor responses, defocus, motion blur, etc.
  $g^{(i)}(\mathbf{x}) = \iint h^{(i)}(\mathbf{x}, \boldsymbol\alpha)\, f\!\left({H^{(i)}}^{-1}(\boldsymbol\alpha)\right) d\boldsymbol\alpha, \quad \mathbf{x} \in \mathbb{Z}^2$
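A minimal 1-D sketch of the observation model, assuming integer circular shifts for the geometric transforms $H^{(i)}$ and a shift-invariant box PSF, both simplifications of the general spatially varying model:

```python
import numpy as np

def observe_1d(f_hi, shift, psf, factor):
    """One observation g^(i): geometric transform (circular integer shift),
    lens/sensor PSF blur, then sampling on the coarse grid."""
    warped = np.roll(f_hi, shift)                     # H^(i)
    blurred = np.convolve(warped, psf, mode="same")   # h^(i)
    return blurred[::factor]                          # sampling

scene = np.zeros(32)
scene[16] = 1.0                        # point source in the high-res scene
psf = np.ones(4) / 4.0                 # box PSF (pixel integration)
frames = [observe_1d(scene, s, psf, 4) for s in range(4)]
```

Each low-resolution frame samples the blurred point at a different sub-pixel phase; together the shifted frames retain all of its mass, which is why sub-pixel registered frames constrain the super-resolved estimate.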

  12. Multi-Frame Observation Model
[Diagram] Continuous scene $f(\mathbf{u})$ → geometric transforms $H^{(1)}, H^{(2)}, \ldots, H^{(N)}$ → lens/sensor PSFs $h^{(1)}, h^{(2)}, \ldots, h^{(N)}$ → sampling $\Delta^{(1)}, \Delta^{(2)}, \ldots, \Delta^{(N)}$ → discrete observed images $g^{(1)}(\mathbf{x}), g^{(2)}(\mathbf{x}), \ldots, g^{(N)}(\mathbf{x})$

  13. Super-Resolution Restoration Observation Model
• Relate observations to the image to be restored (estimate of the scene)
• Discretized approximation $f(\mathbf{k})$ of the scene $f(\mathbf{u})$ using interpolation kernel $h_r$:
  $f(\mathbf{u}) \approx \sum_{\mathbf{k}} f(\mathbf{k})\, h_r(\mathbf{u} - \mathbf{k})$
• Combining with the earlier result, for $\mathbf{x} \in \mathbb{Z}^2$,
  $g^{(i)}(\mathbf{x}) = \iint h^{(i)}(\mathbf{x}, \boldsymbol\alpha)\, f\!\left({H^{(i)}}^{-1}(\boldsymbol\alpha)\right) d\boldsymbol\alpha
  = \sum_{\mathbf{k}} f(\mathbf{k}) \iint h^{(i)}(\mathbf{x}, \boldsymbol\alpha)\, h_r\!\left({H^{(i)}}^{-1}(\boldsymbol\alpha) - \mathbf{k}\right) d\boldsymbol\alpha$
• Identical in form to the resampling expressions
• Can thus find a spatially variant resampling filter relating $g^{(i)}(\mathbf{x})$ to $f(\mathbf{k})$
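Since the discretized model is linear in $f(\mathbf{k})$, each observation can be written as $g^{(i)} = W^{(i)} f$, where row $\mathbf{x}$ of $W^{(i)}$ holds the observation filter values $\rho^{(i)}(\mathbf{x}, \mathbf{k})$. A 1-D sketch (shift + box PSF + decimation are assumed stand-ins for the general model) assembles $W$ by pushing unit impulses through the forward model:

```python
import numpy as np

def forward(f, shift, psf, factor):
    """Assumed 1-D forward model: circular shift, PSF blur, decimation."""
    return np.convolve(np.roll(f, shift), psf, mode="same")[::factor]

def observation_matrix(n_hi, shift, psf, factor):
    """Column k of W is the forward model's response to an impulse at k,
    so row x of W holds the observation-filter values rho(x, k)."""
    eye = np.eye(n_hi)
    return np.stack([forward(eye[:, k], shift, psf, factor)
                     for k in range(n_hi)], axis=1)
```

Probing with impulses works for any linear forward model, so the same trick recovers the spatially varying filter even when the blur is not shift-invariant.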

  14. Comparison of Resampling and Restoration Models

  Resampling                               Restoration
  $f(\mathbf{u})$: discrete texture        $f(\mathbf{u})$: discrete scene estimate
  $r(\mathbf{u})$: reconstruction filter   $h_r(\mathbf{u})$: interpolation kernel
  $H(\mathbf{u})$: geometric transform     $H^{(i)}(\mathbf{u})$: scene/camera motion
  $p(\mathbf{x})$: anti-alias prefilter    $h^{(i)}(\mathbf{x}, \boldsymbol\alpha)$: observation SVPSF
  $g(\mathbf{x})$: warped output image     $g^{(i)}(\mathbf{x})$: observed images

• Resampling filter: $\rho(\mathbf{x}, \mathbf{k}) = \iint p(\mathbf{x} - H(\mathbf{u}))\, r(\mathbf{u} - \mathbf{k}) \left| \frac{\partial H}{\partial \mathbf{u}} \right| d\mathbf{u}$
• Observation filter: $\rho^{(i)}(\mathbf{x}, \mathbf{k}) = \iint h^{(i)}\!\left(\mathbf{x}, H^{(i)}(\mathbf{u})\right) h_r(\mathbf{u} - \mathbf{k}) \left| \frac{\partial H^{(i)}}{\partial \mathbf{u}} \right| d\mathbf{u}$
• But how do we find the observation filter in practice?

  15. Determining the Observation Filter
• Measure or model the combined PSFs
  – Lens, sensor spatial integration, temporal integration, defocus, etc.
• Estimate or model the inter-frame registration (geometric transforms)
  – Motion estimation from the observed scenes, observation geometry, etc.
• Present an example:
  – PSF accounts for a diffraction-limited optical system and sensor spatial integration
  – Geometric transforms based on controlled imaging geometry
  – Demonstrate the method for finding the observation filter

  16. Optical System Modeling
Assumptions:
• Diffraction limited
• Incoherent illumination
• Circular exit pupil
⇒ Radially symmetric point spread function
  $h(r') = \left( \frac{2 J_1(r')}{r'} \right)^2, \quad \text{with } r' = (\pi / \lambda N)\, r$
where $J_1(\cdot)$ is the Bessel function of the first kind, $\lambda$ the wavelength, $N$ the f-number, and $r$ the radial distance.
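The PSF can be evaluated directly. Here $J_1$ is computed from its power series, an assumed stand-in for a library Bessel routine that is adequate for the small arguments used here:

```python
import math

def j1_series(x, terms=30):
    """Bessel function J1 via its power series; fine for |x| up to ~10."""
    return sum((-1) ** m / (math.factorial(m) * math.factorial(m + 1))
               * (x / 2.0) ** (2 * m + 1) for m in range(terms))

def airy_psf(r, wavelength, fnum):
    """h(r') = (2 J1(r') / r')^2 with r' = (pi / (lambda * N)) * r."""
    rp = math.pi / (wavelength * fnum) * r
    if rp == 0.0:
        return 1.0          # limit of 2 J1(r')/r' as r' -> 0 is 1
    return (2.0 * j1_series(rp) / rp) ** 2
```

With the slide's values $\lambda = 550$ nm and $N = 2.8$, the response vanishes near $r = 1.22\lambda N \approx 1.88\ \mu$m, the first zero of the Airy disk.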

  17. Example – Optical System PSF
• $\lambda = 550$ nm (green), $N = 2.8$
• First zero of the Airy disk at $1.22 \lambda N = 1.88\ \mu$m
[Plots: lens PSF surface and contour views, $x$ and $y$ in microns]

  18. Optical Transfer Function
  $H(\rho') = \begin{cases} \dfrac{2}{\pi} \left[ \cos^{-1}(\rho') - \rho' \sqrt{1 - \rho'^2} \right], & \rho' \le 1 \\ 0, & \text{otherwise} \end{cases}$
where
• $\rho' = \rho / \rho_c$: normalized radial spatial frequency
• $\rho$: radial spatial frequency
• $\rho_c = 1 / \lambda N$: radial spatial frequency cut-off
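The OTF above can be evaluated directly; a sketch (frequencies in cycles per metre when $\lambda$ is in metres):

```python
import math

def otf(rho, wavelength, fnum):
    """Diffraction-limited incoherent OTF: H(rho') with rho' = rho / rho_c
    and cut-off rho_c = 1 / (lambda * N); identically zero beyond cut-off."""
    rho_c = 1.0 / (wavelength * fnum)
    rp = rho / rho_c
    if rp >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(rp) - rp * math.sqrt(1.0 - rp * rp))
```

The function is 1 at zero frequency and decreases monotonically to 0 at the cut-off.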

  19. Example – Optical Transfer Function
• Radial frequency cut-off $\rho_c = 1/\lambda N = 649.35$ lines/mm
• Nyquist sampling requires $> 2 \rho_c \approx 1300$ samples/mm
• Sample spacing $< 0.77\ \mu$m! ... (but pixel spatial integration is a crude LPF)
[Plots: optical transfer function surface and radial profile, frequencies in cycles/mm]
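The slide's numbers follow directly from the cut-off formula; a quick arithmetic check:

```python
wavelength, fnum = 550e-9, 2.8          # metres, dimensionless

rho_c = 1.0 / (wavelength * fnum)       # cut-off, cycles per metre
nyquist_rate = 2.0 * rho_c              # required sampling rate
spacing = 1.0 / nyquist_rate            # required sample spacing, metres

print(rho_c / 1000.0)                   # ~649.35 lines/mm
print(spacing * 1e6)                    # ~0.77 microns
```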

  20. Combined Transfer Function
• LPF due to spatial integration over the light-sensitive area of the pixel
• Assume an ideal pixel response (100% fill-factor, flat response, dimension $T$)
• The sinc from pixel integration has its first zeros at the sampling frequency ($f_s = 1/T$)
• Need zero response for $f > f_s / 2$ for correct anti-aliasing
[Plots: pixel modulation transfer function and combined modulation transfer function, frequencies in cycles/mm]
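The pixel's MTF is the magnitude of a sinc; a sketch showing why integration alone is a poor anti-alias filter (the pixel size $T$ in the test is an illustrative value, not from the slides):

```python
import math

def pixel_mtf(f, T):
    """MTF of an ideal pixel of dimension T (100% fill factor):
    |sin(pi f T) / (pi f T)|, with first zeros at f = k / T = k * f_s."""
    if f == 0.0:
        return 1.0
    x = math.pi * f * T
    return abs(math.sin(x) / x)
```

At the Nyquist frequency $f_s/2$ the response is still $2/\pi \approx 0.64$, and it remains well above zero beyond $f_s/2$, so frequencies above Nyquist alias into the samples.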

  21. Combined Pixel Response Function
• $h_{\text{combined}} = h_{\text{lens}} \ast h_{\text{pixel}}$
• Significant “leakage” even with ideal lens/pixel responses
• Reality is much worse (optics and sensors)
[Plots: combined pixel and lens response surface and cross-section]
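In the frequency domain the spatial convolution $h_{\text{combined}} = h_{\text{lens}} \ast h_{\text{pixel}}$ becomes a product of transfer functions. A sketch with an assumed 10 µm pixel (this size is not from the slides) makes the leakage beyond Nyquist concrete:

```python
import math

def lens_otf(rho, wavelength, fnum):
    """Diffraction-limited incoherent OTF, cut-off 1 / (lambda * N)."""
    rp = rho * wavelength * fnum
    if rp >= 1.0:
        return 0.0
    return (2.0 / math.pi) * (math.acos(rp) - rp * math.sqrt(1.0 - rp * rp))

def pixel_mtf(rho, T):
    """MTF of an ideal pixel of dimension T."""
    if rho == 0.0:
        return 1.0
    x = math.pi * rho * T
    return abs(math.sin(x) / x)

def combined_mtf(rho, wavelength, fnum, T):
    # convolution in space = multiplication in frequency
    return lens_otf(rho, wavelength, fnum) * pixel_mtf(rho, T)

wl, N, T = 550e-9, 2.8, 10e-6     # pixel size T is an assumed example value
fs = 1.0 / T                      # sampling frequency
```

With these numbers the combined response at $0.75 f_s$, well above Nyquist, is still roughly a quarter of the DC value: that residual energy aliases into the samples.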

  22. Determining the Warped Pixel Response
• Backproject the PSF $h(\mathbf{x}, \boldsymbol\alpha)$ from $g(\mathbf{x})$ to the restored image using $H^{-1}$ (red)
• Determine a bounding region for the image of $h(\mathbf{x}, \boldsymbol\alpha)$ under backprojection (cyan)
• For all S-R pixels $\mathbf{u}$ in the region, project via $H$ and find $h(\mathbf{x}, H(\mathbf{u}))$ (green)
• Scale according to the Jacobian and interpolation kernel $h_r$, then integrate over $\mathbf{u}$
[Diagram: observed image $g(\mathbf{x})$ and restored image $f(\mathbf{u})$ related by $H$ and $H^{-1}$]
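The four steps can be sketched in 1-D. The shift-invariant tent PSF, tent interpolation kernel, and the warp passed in are all assumed simplifications of the slide's general spatially varying case:

```python
import numpy as np

def observation_weights(x, H, H_inv, jac, psf, psf_radius, hr, pad=3):
    """For one observed pixel x: backproject the PSF support, bound the
    region of contributing S-R samples k, then accumulate
    rho(x, k) = ∫ psf(x - H(u)) hr(u - k) |dH/du| du by quadrature."""
    # 1. backproject the PSF support [x - r, x + r] into u-space
    lo, hi = sorted((H_inv(x - psf_radius), H_inv(x + psf_radius)))
    # 2. bounding region of S-R sample points k (padded by kernel support)
    ks = np.arange(int(np.floor(lo)) - pad, int(np.ceil(hi)) + pad + 1)
    # 3./4. project forward, weight by PSF, kernel and Jacobian, integrate
    u = np.linspace(lo - pad, hi + pad, 4001)
    du = u[1] - u[0]
    weights = np.array([np.sum(psf(x - H(u)) * hr(u - k) * jac(u)) * du
                        for k in ks])
    return ks, weights
```

For the identity warp with unit Jacobian, the weights for one output pixel sum to one, as expected of an averaging filter built from partition-of-unity kernels.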
