  1. Noise, Image Reconstruction with Noise. EE367/CS448I: Computational Imaging and Display. stanford.edu/class/ee367. Lecture 10. Gordon Wetzstein, Stanford University.

  2. What’s a Pixel? A photon-to-electron converter (photoelectric effect). (sources: Molecular Expressions, Wikipedia)

  3. What’s a Pixel?
     • microlens: focuses light on the photodiode
     • color filter: selects the color channel
     • quantum efficiency: ~50%
     • fill factor: fraction of the surface area used for light gathering
     • photon-to-charge conversion and ADC introduce noise!
     (source: Molecular Expressions)

  4. ISO (“film speed”): sensitivity of the sensor to light, i.e. digital gain. (source: bobatkins.com)

  5. Noise
     • noise is (usually) bad!
     • many sources of noise: heat, electronics, amplifier gain, photon-to-electron conversion, pixel defects, read noise, …
     • let’s start with something simple: fixed pattern noise

  6. Fixed Pattern Noise
     • dead or “hot” pixels, dust, …
     • remove with dark frame calibration: I = (I_captured − I_dark) / (I_white − I_dark)
     • calibrate on the RAW image, not JPEG (nonlinear)!
     (source: Emil Martinec)
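
The dark-frame calibration formula above can be sketched in a few lines of numpy. This is a minimal illustration, assuming linear RAW data; the function name and epsilon guard are my own, not from the course materials.

```python
import numpy as np

def dark_frame_calibrate(captured, dark, white, eps=1e-8):
    """Fixed pattern noise removal: I = (I_captured - I_dark) / (I_white - I_dark).

    captured: raw frame of the scene
    dark:     frame taken with the shutter closed (offsets, hot pixels)
    white:    flat-field frame of a uniform white target (per-pixel gain)
    """
    captured = captured.astype(np.float64)
    dark = dark.astype(np.float64)
    white = white.astype(np.float64)
    # eps guards against dead pixels where white == dark
    return (captured - dark) / np.maximum(white - dark, eps)
```

If each pixel applies gain g and offset d, the captured frame is g·x + d, the dark frame is d, and the white frame is g + d, so the quotient recovers x exactly.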

  7. Noise
     • beyond fixed pattern noise, different noise sources follow different statistical distributions; these two are crucial:
     • Gaussian
     • Poisson

  8. Gaussian Noise
     • thermal, read, amplifier noise
     • additive, signal-independent, zero-mean

  9. Gaussian Noise
     • additive Gaussian noise: b = x + η, with x ∼ N(x, 0)
     • with i.i.d. (independent and identically distributed) Gaussian noise: η ∼ N(0, σ²)

  10. Gaussian Noise
     • b = x + η, with x ∼ N(x, 0) and i.i.d. Gaussian noise η ∼ N(0, σ²), so b ∼ N(x, σ²)
     • p(b | x, σ²) = (1 / √(2πσ²)) exp(−(b − x)² / (2σ²))

  11. Gaussian Noise
     • b = x + η, with i.i.d. Gaussian noise η ∼ N(0, σ²), so b ∼ N(x, σ²)
     • Bayes’ rule: p(b | x, σ²) = (1 / √(2πσ²)) exp(−(b − x)² / (2σ²))

  12. Gaussian Noise - MAP
     • b = x + η, with i.i.d. Gaussian noise η ∼ N(0, σ²), so b ∼ N(x, σ²)
     • Bayes’ rule: p(b | x, σ²) = (1 / √(2πσ²)) exp(−(b − x)² / (2σ²))
     • maximum-a-posteriori estimation

  13. Gaussian Noise – MAP “Flat” Prior
     • trivial solution (not useful in practice): p(x) = 1
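
Spelled out, a flat prior makes MAP collapse to maximum likelihood, and for additive Gaussian noise the estimate is just the noisy measurement itself, which is why this prior is not useful in practice:

```latex
\hat{x}_{\mathrm{MAP}}
  = \arg\max_x \; p(x \mid b)
  = \arg\max_x \; p(b \mid x)\, p(x)
  = \arg\max_x \; \tfrac{1}{\sqrt{2\pi\sigma^2}}\, e^{-\frac{(b-x)^2}{2\sigma^2}}
  = b
```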

  14. Gaussian Noise – MAP Self Similarity Prior
     • Gaussian “denoisers” like non-local means and other self-similarity priors actually solve this problem
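
A naive non-local means sketch shows the idea behind self-similarity priors: each pixel is replaced by a weighted average of all pixels whose surrounding patches look similar. This numpy-only version (3x3 patches, full-image search, names illustrative) is O(N²) and only practical for tiny images; it is not the course implementation.

```python
import numpy as np

def nlm_denoise(img, h=0.1, patch=1):
    """Naive non-local means; h plays the role of the noise std. dev."""
    p = patch
    padded = np.pad(img, p, mode="reflect")
    H, W = img.shape
    # collect one (2p+1)x(2p+1) patch per pixel, flattened to a row
    patches = np.array([
        padded[i:i + 2 * p + 1, j:j + 2 * p + 1].ravel()
        for i in range(H) for j in range(W)
    ])
    flat = img.ravel()
    out = np.empty(H * W)
    for k in range(H * W):
        d2 = np.mean((patches - patches[k]) ** 2, axis=1)  # patch distances
        w = np.exp(-d2 / (h * h))                          # similarity weights
        out[k] = np.sum(w * flat) / np.sum(w)              # weighted average
    return out.reshape(H, W)
```

On a noisy flat region, all patches look alike, so the weights approach uniform averaging and the noise variance drops sharply.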

  15. General Self Similarity Prior
     • generic proximal operator for a function f(x)
     • proximal operator for some image prior

  16. General Self Similarity Prior
     • can use self-similarity as a general image prior (not just for denoising)
     • proximal operator for some image prior

  17. General Self Similarity Prior
     • can use self-similarity as a general image prior (not just for denoising)
     • σ² = λ/ρ, σ = √(λ/ρ) (the h parameter in most NLM implementations is this std. dev.)
     • proximal operator for some image prior

  18. Image Reconstruction with Gaussian Noise
     • image formation: b = Ax + η, with Ax ∼ N(Ax, 0)
     • with i.i.d. Gaussian noise: η ∼ N(0, σ²)

  19. Image Reconstruction with Gaussian Noise
     • b = Ax + η, with i.i.d. Gaussian noise η ∼ N(0, σ²), so b ∼ N(Ax, σ²)
     • p(b | x, σ²) = (1 / √(2πσ²)) exp(−(b − Ax)² / (2σ²))

  20. Image Reconstruction with Gaussian Noise
     • b = Ax + η, with i.i.d. Gaussian noise η ∼ N(0, σ²), so b ∼ N(Ax, σ²)
     • Bayes’ rule: p(b | x, σ²) = (1 / √(2πσ²)) exp(−(b − Ax)² / (2σ²))
     • maximum-a-posteriori estimation: regularized least squares (use ADMM)
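
The ADMM skeleton for regularized least squares can be sketched as below. The course uses image priors (TV, self-similarity) in the z-update; here the prior is swapped for an L1 term so the proximal step is a closed-form soft-threshold and the example stays self-contained. All names are illustrative.

```python
import numpy as np

def admm_reg_lsq(A, b, lam=0.1, rho=1.0, iters=200):
    """ADMM for: minimize 0.5*||Ax - b||^2 + lam*||x||_1 (L1 as stand-in prior)."""
    m, n = A.shape
    x = np.zeros(n)
    z = np.zeros(n)   # splitting variable carrying the prior
    u = np.zeros(n)   # scaled dual variable
    AtA = A.T @ A
    Atb = A.T @ b
    solve = np.linalg.inv(AtA + rho * np.eye(n))  # cache the x-update system
    for _ in range(iters):
        x = solve @ (Atb + rho * (z - u))                          # data term
        z = np.sign(x + u) * np.maximum(np.abs(x + u) - lam / rho, 0)  # prox of prior
        u = u + x - z                                              # dual ascent
    return z
```

Swapping the z-update for an NLM denoising step with σ = √(λ/ρ) gives the plug-and-play self-similarity prior from the preceding slides.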

  21. Scientific Sensors
     • e.g., Andor iXon Ultra 897: cooled to −100 °C
     • scientific CMOS & CCD
     • reduce pretty much all noise except photon noise

  22. What is Photon (or Shot) Noise?
     • noise caused by fluctuations when incident light is converted to charge (detection)
     • also observed in the light emitted by a source, i.e. due to the particle nature of light (emission)
     • same for re-emission → cascaded Poisson processes are also described by a Poisson process [Teich and Saleh 1998]

  23. Photon Noise
     • noise is signal dependent!
     • for N measured photo-electrons:
     • mean is N
     • variance is σ² = N, standard deviation is σ = √N
     • Poisson distribution: f(k; N) = N^k e^(−N) / k!

  24. Photon Noise - SNR
     • signal-to-noise ratio: SNR = N / √N = √N
     • N photons: SNR = √N; 2N photons: SNR = √(2N)
     • nonlinear! (plot: SNR in dB vs. N, e.g. N = 1,000 and 10,000)

  25. Photon Noise - SNR
     • signal-to-noise ratio: SNR = N / √N = √N
     • N photons: SNR = √N; 2N photons: SNR = √(2N)
     • nonlinear! (source: Wikipedia)
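
The SNR = √N relationship is easy to verify by sampling photon counts from a Poisson distribution; a quick numpy check (sample sizes chosen arbitrarily for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
for N in (1_000, 10_000):
    # simulate many pixels each receiving N photons on average
    counts = rng.poisson(N, size=200_000)
    snr = counts.mean() / counts.std()   # SNR = mean / std. dev.
    print(f"N={N}: measured SNR = {snr:.1f}, sqrt(N) = {np.sqrt(N):.1f}")
```

Doubling the photon count only improves SNR by √2, which is the nonlinearity the slide points out.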

  26. Maximum Likelihood Solution for Poisson Noise
     • image formation
     • probability of measurement i
     • joint probability of all M measurements (use notation trick)

  27. Maximum Likelihood Solution for Poisson Noise
     • log-likelihood function
     • gradient
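
The equations on these two slides were images and did not survive extraction. The standard forms for Poisson measurements b_i ∼ Poisson((Ax)_i), consistent with the Richardson-Lucy update derived on the following slides, are:

```latex
p(b_i \mid x) = \frac{(Ax)_i^{\,b_i}\, e^{-(Ax)_i}}{b_i!}, \qquad
\mathcal{L}(x) = \sum_{i=1}^{M} \left( b_i \log (Ax)_i - (Ax)_i - \log b_i! \right),
\qquad
\nabla_x \mathcal{L}(x) = A^{T}\!\left(\frac{b}{Ax}\right) - A^{T}\mathbf{1}
```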

  28. Maximum Likelihood Solution for Poisson Noise
     • Richardson-Lucy algorithm: iterative approach to ML estimation for Poisson noise
     • simple idea:
       1. at the solution, the gradient will be zero
       2. when converged, further iterations will not change the estimate

  29. Richardson-Lucy Algorithm
     • equate the gradient to zero
     • rearrange so that 1 is on one side of the equation

  30. Richardson-Lucy Algorithm
     • equate the gradient to zero
     • rearrange so that 1 is on one side of the equation
     • set the other side equal to the update ratio x^(k+1) / x^(k)

  31. Richardson-Lucy Algorithm
     • for any multiplicative update rule scheme:
     • start with a positive initial guess (e.g. random values)
     • apply the iteration scheme
     • future updates are guaranteed to remain positive
     • the residual always gets smaller
     • RL multiplicative update rules
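
The multiplicative update can be sketched for a generic matrix image-formation model b = Ax. This is a minimal illustration, assuming nonnegative A, b, and a small epsilon to guard divisions; it is not the course implementation.

```python
import numpy as np

def richardson_lucy(A, b, iters=1000, eps=1e-12):
    """RL update: x <- x * ( A^T (b / (A x)) / (A^T 1) ).

    Starting from a positive guess, every iterate stays positive
    because the update is multiplicative with nonnegative factors.
    """
    x = np.full(A.shape[1], b.mean() + eps)  # positive initial guess
    At1 = A.T @ np.ones_like(b)
    for _ in range(iters):
        ratio = b / (A @ x + eps)            # measured over predicted
        x = x * (A.T @ ratio) / (At1 + eps)  # multiplicative correction
    return x
```

At the fixed point, b / (Ax) = 1 everywhere, so the correction factor is 1 and the gradient of the Poisson log-likelihood vanishes.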

  32. Richardson-Lucy Algorithm - Deconvolution
     (figure: blurry & noisy measurement vs. RL deconvolution result)

  33. Richardson-Lucy Algorithm
     • what went wrong?
     • Poisson deconvolution is a tough problem; without priors it’s pretty much hopeless
     • let’s try to incorporate the one prior we have learned: total variation

  34. Richardson-Lucy Algorithm + TV
     • log-likelihood function
     • gradient

  35. Richardson-Lucy Algorithm + TV
     • gradient of the anisotropic TV term
     • this is “dirty”: possible division by 0!

  36. Richardson-Lucy Algorithm + TV
     • follow the same logic as RL to get the RL+TV multiplicative update rules
     • 2 major problems & solution “hacks”:
       1. still possible division by 0 when the gradient is zero → set the fraction to 0 if the gradient is 0
       2. the multiplicative update may become negative → only work with (very) small values for lambda
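
A hedged sketch of this “dirty” RL+TV scheme (in the spirit of Dey et al. 2004), written in 1D for brevity; hack 1 is handled for free because numpy’s sign() returns 0 at zero differences, and hack 2 is respected by keeping lambda small. Function names and the epsilon guard are illustrative assumptions.

```python
import numpy as np

def tv_gradient(x):
    """Gradient of anisotropic TV(x) = sum_i |x[i+1] - x[i]| in 1D."""
    d = np.sign(np.diff(x))   # sign(0) = 0 implements hack 1 automatically
    g = np.zeros_like(x)
    g[:-1] -= d               # -sign(x[i+1] - x[i]) contribution
    g[1:] += d                # +sign(x[i] - x[i-1]) contribution
    return g

def rl_tv(A, b, lam=0.01, iters=200, eps=1e-12):
    """RL with the TV gradient folded into the denominator (keep lam small)."""
    x = np.full(A.shape[1], b.mean() + eps)   # positive initial guess
    At1 = A.T @ np.ones_like(b)
    for _ in range(iters):
        ratio = A.T @ (b / (A @ x + eps))
        x = x * ratio / (At1 + lam * tv_gradient(x) + eps)
    return x
```

With lam = 0 this reduces to plain RL; with a large lam the denominator can go nonpositive, which is exactly the failure mode the slide warns about.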

  37. (A Dirty but Easy Approach to) Richardson-Lucy with a TV Prior
     (figure: measurements, RL, RL+TV with lambda = 0.005 and lambda = 0.025; log residual and mean squared error plots)

  38. Signal-to-Noise Ratio (SNR)
     • SNR = signal / noise = μ (mean pixel value) / σ (standard deviation of pixel value)
     • SNR = P·Q_e·t / √(P·Q_e·t + D·t + N_r²)
     • P = incident photon flux (photons/pixel/sec)
     • Q_e = quantum efficiency
     • t = exposure time (sec)
     • D = dark current (electrons/pixel/sec), including hot pixels
     • N_r = read noise (rms electrons/pixel), including fixed pattern noise

  39. Signal-to-Noise Ratio (SNR)
     • SNR = signal / noise = μ (mean pixel value) / σ (standard deviation of pixel value)
     • SNR = P·Q_e·t / √(P·Q_e·t + D·t + N_r²); under the square root, P·Q_e·t is the signal-dependent (shot) noise and D·t + N_r² is signal-independent
     • P = incident photon flux (photons/pixel/sec); Q_e = quantum efficiency; t = exposure time (sec); D = dark current (electrons/pixel/sec), including hot pixels; N_r = read noise (rms electrons/pixel), including fixed pattern noise
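
The SNR model on this slide is a one-liner in code; a small sketch for exploring the trade-offs (the parameter values in the usage note are made up for illustration, not from any sensor datasheet):

```python
import numpy as np

def sensor_snr(P, Qe, t, D, Nr):
    """SNR = P*Qe*t / sqrt(P*Qe*t + D*t + Nr^2).

    P:  incident photon flux (photons/pixel/sec)
    Qe: quantum efficiency
    t:  exposure time (sec)
    D:  dark current (electrons/pixel/sec)
    Nr: read noise (rms electrons/pixel)
    """
    signal = P * Qe * t
    return signal / np.sqrt(signal + D * t + Nr ** 2)
```

With D = Nr = 0 (the photon-limited regime of a cooled scientific sensor), this collapses to SNR = √(P·Q_e·t), i.e. the shot-noise √N from the earlier slides; lengthening the exposure always improves SNR.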

  40. Next: Compressive Imaging
     • single pixel camera [Wakin et al. 2006]
     • compressive hyperspectral imaging
     • compressive light field imaging [Marwah et al. 2013]

  41. References and Further Reading
     • https://en.wikipedia.org/wiki/Shot_noise
     • L. Lucy, 1974, “An iterative technique for the rectification of observed distributions”, Astron. J. 79, 745–754.
     • W. Richardson, 1972, “Bayesian-based iterative method of image restoration”, J. Opt. Soc. Am. 62(1), 55–59.
     • N. Dey, L. Blanc-Feraud, C. Zimmer, Z. Kam, J. Olivo-Marin, J. Zerubia, 2004, “A deconvolution method for confocal microscopy with total variation regularization”, IEEE Symposium on Biomedical Imaging: Nano to Macro, 1223–1226.
     • M. Teich, E. Saleh, 1998, “Cascaded stochastic processes in optics”, Traitement du Signal 15(6).
     • please also read the lecture notes, especially for the “clean” ADMM derivation for solving the maximum likelihood estimation of Poisson reconstruction with a TV prior!
