On robust estimation and smoothing

On robust estimation and smoothing with spatial and tonal kernels - PowerPoint PPT Presentation

Geometric Properties from Incomplete Data, Dagstuhl, March 2004. On robust estimation and smoothing with spatial and tonal kernels. Pavel Mrázek, Joachim Weickert and Andrés Bruhn. Mathematical Image Analysis Group, Saarland University.


  1. Geometric Properties from Incomplete Data, Dagstuhl, March 2004. On robust estimation and smoothing with spatial and tonal kernels. Pavel Mrázek, Joachim Weickert and Andrés Bruhn. Mathematical Image Analysis Group, Saarland University. http://www.mia.uni-saarland.de

  2. Estimation from noisy data. General idea: constant signal $u$ + noise $n$ = measured noisy data $f_i$. Task: estimate the signal $u$. For Gaussian noise, the mean $u = \frac{1}{N}\sum_{j=1}^{N} f_j$ minimizes $E(u) = \sum_{j=1}^{N} (u - f_j)^2$. For noise with heavier tails, employ robust error norms.

  3. Estimation from noisy data (continued). M-estimators: minimize $E(u) = \sum_{j=1}^{N} \Psi(|u - f_j|^2)$. Examples: $\Psi(s^2) = s^2$ → mean; $\Psi(s^2) = |s|$ → median; $\Psi(s^2) = 1 - e^{-s^2/\lambda^2}$ → "mode"; $\Psi(s^2) = \min(s^2, \lambda^2)$ → "mode". [Plots of the four error norms.]
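The effect of the different error norms can be checked numerically. Below is a minimal Python/NumPy sketch (not from the slides; the function names, the brute-force grid search and the test data are illustrative assumptions) that estimates a constant signal from data contaminated by outliers:

```python
import numpy as np

# The four error norms Psi(s^2) from the slide, written as functions of s^2.
def psi_quadratic(s2, lam=1.0):   # Psi(s^2) = s^2                 -> mean
    return s2

def psi_absolute(s2, lam=1.0):    # Psi(s^2) = |s|                 -> median
    return np.sqrt(s2)

def psi_leclerc(s2, lam=1.0):     # Psi(s^2) = 1 - exp(-s^2/lam^2) -> "mode"
    return 1.0 - np.exp(-s2 / lam**2)

def psi_truncated(s2, lam=1.0):   # Psi(s^2) = min(s^2, lam^2)     -> "mode"
    return np.minimum(s2, lam**2)

def m_estimate(f, psi, lam=1.0, n_grid=2001):
    """Minimize E(u) = sum_j Psi(|u - f_j|^2) by brute-force search over a grid."""
    candidates = np.linspace(f.min(), f.max(), n_grid)
    energies = [psi((u - f)**2, lam).sum() for u in candidates]
    return candidates[int(np.argmin(energies))]

# Constant signal plus Gaussian noise and a few gross outliers.
rng = np.random.default_rng(0)
f = 5.0 + rng.normal(0.0, 0.2, 100)
f[:10] += 20.0                        # heavy-tailed contamination

print(m_estimate(f, psi_quadratic))   # pulled towards the outliers (the mean)
print(m_estimate(f, psi_leclerc))     # stays near 5 (robust)
```

With the quadratic norm the estimate is dragged towards the outliers, while the robust norms essentially ignore them.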

  4. Local estimates. Data $f_i$ measured at position $x_i$ → find a local estimate $u_i$ as $u_i = \operatorname{argmin}_u \sum_{j=1}^{N} \Psi(|u - f_j|^2)\, w(|x_i - x_j|^2)$. Window choices: $w(s^2) = 1$ if $s^2 < \theta$, $0$ otherwise (hard window); $w(s^2) = e^{-s^2/\theta^2}$ (soft window, Chu et al. 1996). [Plots of the hard and soft windows.]

  5. Local estimates (continued). Local M-smoothers: minimize $E(u) = \sum_{i=1}^{N} \sum_{j \in B(i)} \Psi(|u_i - f_j|^2)\, w(|x_i - x_j|^2)$ over all $u_i$, where $B(i)$ denotes the spatial neighbourhood of sample $i$.
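As an illustration of the notation, here is a small sketch (assumptions: Python/NumPy, $B(i)$ taken as all samples so that the window alone does the localization, and $\Psi$ passed in as one of the error norms above; names are illustrative) that evaluates the windowed energy with either the hard or the soft window:

```python
import numpy as np

def w_hard(s2, theta=4.0):
    """Hard window: w(s^2) = 1 if s^2 < theta, else 0."""
    return (s2 < theta).astype(float)

def w_soft(s2, theta=2.0):
    """Soft window: w(s^2) = exp(-s^2 / theta^2) (Chu et al. 1996)."""
    return np.exp(-s2 / theta**2)

def local_m_energy(u, f, x, psi, w, lam=1.0):
    """E(u) = sum_i sum_{j in B(i)} Psi(|u_i - f_j|^2) w(|x_i - x_j|^2),
    with B(i) taken as all samples."""
    tonal = psi((u[:, None] - f[None, :])**2, lam)    # Psi(|u_i - f_j|^2)
    spatial = w((x[:, None] - x[None, :])**2)         # w(|x_i - x_j|^2)
    return float(np.sum(tonal * spatial))
```

Here `psi` would be any of the error norms of slide 3 (e.g. the Leclerc-type `psi_leclerc` sketched above).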

  6. Local M-smoothers. Gradient descent on $E(u) = \sum_{i=1}^{N} \sum_{j \in B(i)} \Psi(|u_i - f_j|^2)\, w(|x_i - x_j|^2)$: $u_i^{k+1} = u_i^k - \tau \frac{\partial E}{\partial u_i} = u_i^k - \tau \sum_{j \in B(i)} 2\, \Psi'(|u_i^k - f_j|^2)\, (u_i^k - f_j)\, w(|x_i - x_j|^2) = \Big(1 - 2\tau \sum_{j \in B(i)} \Psi'(|u_i^k - f_j|^2)\, w(|x_i - x_j|^2)\Big)\, u_i^k + 2\tau \sum_{j \in B(i)} \Psi'(|u_i^k - f_j|^2)\, w(|x_i - x_j|^2)\, f_j$.

  7. Local M-smoothers (continued). Setting the step size $\tau = \frac{1}{2 \sum_{j \in B(i)} \Psi'(|u_i^k - f_j|^2)\, w(|x_i - x_j|^2)}$ ...

  8. Local M-smoothers (continued). ... we obtain the fixed-point iteration $u_i^{k+1} = \dfrac{\sum_{j \in B(i)} \Psi'(|u_i^k - f_j|^2)\, w(|x_i - x_j|^2)\, f_j}{\sum_{j \in B(i)} \Psi'(|u_i^k - f_j|^2)\, w(|x_i - x_j|^2)}$.
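The resulting fixed-point scheme is straightforward to implement. A minimal 1-D sketch (assumptions: Python/NumPy, the Leclerc norm $\Psi(s^2) = 1 - e^{-s^2/\lambda^2}$, a Gaussian spatial window, $B(i)$ taken as the whole signal; names and parameters are illustrative):

```python
import numpy as np

def local_m_smoother(f, x, n_iter=20, lam=0.5, theta=2.0):
    """Fixed-point iteration
        u_i^{k+1} = sum_j Psi'(|u_i^k - f_j|^2) w(|x_i - x_j|^2) f_j
                  / sum_j Psi'(|u_i^k - f_j|^2) w(|x_i - x_j|^2),
    with Psi(s^2) = 1 - exp(-s^2/lam^2), hence Psi'(s^2) = exp(-s^2/lam^2)/lam^2,
    the soft window w(s^2) = exp(-s^2/theta^2), and B(i) = the whole signal."""
    dx2 = (x[:, None] - x[None, :])**2
    w = np.exp(-dx2 / theta**2)                      # spatial weights w(|x_i - x_j|^2)
    u = f.astype(float).copy()
    for _ in range(n_iter):
        du2 = (u[:, None] - f[None, :])**2           # tonal distances |u_i^k - f_j|^2
        psi_prime = np.exp(-du2 / lam**2) / lam**2   # Psi'(|u_i^k - f_j|^2)
        weights = psi_prime * w
        u = (weights * f[None, :]).sum(axis=1) / weights.sum(axis=1)
    return u

# Noisy step edge: the robust tonal term smooths the noise but preserves the jump.
rng = np.random.default_rng(1)
x = np.arange(100, dtype=float)
f = np.where(x < 50, 0.0, 1.0) + rng.normal(0.0, 0.1, 100)
u = local_m_smoother(f, x)
```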

  9. The big picture. GLOBAL: M-estimators, $\sum_j \Psi(|u - f_j|^2)$. WINDOWED: local M-smoothers, $\sum_i \sum_j \Psi(|u_i - f_j|^2)\, w(|x_i - x_j|^2)$. LOCAL: (filled in on the following slides).

  10. Bayesian framework / regularization theory. Take the local M-estimator and decrease the spatial window size → $w(|x_i - x_j|^2) = 1$ if $x_i = x_j$, $0$ otherwise. ⇒ Minimizing $E_D(u) = \sum_{i=1}^{N} \Psi(|u_i - f_i|^2)$ has the trivial solution $u_i = f_i$.

  11. Bayesian framework / regularization theory (continued). Bayesian / regularization framework: combine with prior knowledge, e.g. assumptions about the smoothness of $u$: $E(u) = \alpha\, E_D(u) + (1 - \alpha)\, E_S(u) = \sum_i \big[ \alpha\, \Psi_D(|u_i - f_i|^2) + (1 - \alpha)\, \Psi_S(|\nabla u|_i^2) \big]$. Covered are, e.g., the Mumford-Shah functional ($\Psi_D(s^2) = s^2$, $\Psi_S(s^2) = \min(s^2, \lambda^2)$), the graduated non-convexity of Blake and Zisserman, and nonlinear diffusion filters (Perona-Malik, TV flow, ...).
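For a 1-D signal such a regularization energy can be minimized by plain gradient descent. A sketch under stated assumptions (Python/NumPy, $\Psi_D(s^2) = s^2$, the Perona-Malik penalizer for $\Psi_S$, forward differences for $\nabla u$; step size and parameters are illustrative, not taken from the slides):

```python
import numpy as np

def regularized_denoise(f, alpha=0.5, lam=0.1, tau=0.1, n_iter=200):
    """Gradient descent on a 1-D discretization of
        E(u) = sum_i [ alpha * Psi_D(|u_i - f_i|^2) + (1 - alpha) * Psi_S(|grad u|_i^2) ],
    with Psi_D(s^2) = s^2 and the Perona-Malik penalizer
    Psi_S(s^2) = lam^2 * log(1 + s^2/lam^2), so Psi_S'(s^2) = 1/(1 + s^2/lam^2).
    The gradient of u is approximated by forward differences d_i = u_{i+1} - u_i."""
    u = f.astype(float).copy()
    for _ in range(n_iter):
        d = np.diff(u)                              # d_i = u_{i+1} - u_i
        flux = d / (1.0 + d**2 / lam**2)            # Psi_S'(d_i^2) * d_i
        div = np.zeros_like(u)
        div[:-1] += flux                            # div_i gets +flux_i ...
        div[1:]  -= flux                            # ... and -flux_{i-1}
        grad_E = 2 * alpha * (u - f) - 2 * (1 - alpha) * div   # dE/du_i
        u = u - tau * grad_E
    return u
```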

  12. The big picture. GLOBAL: M-estimators, $\sum_j \Psi(|u - f_j|^2)$. WINDOWED: local M-smoothers, $\sum_i \sum_j \Psi(|u_i - f_j|^2)\, w(|x_i - x_j|^2)$. LOCAL: Bayesian / regularization theory, $\alpha\, \Psi_D(|u - f|^2) + (1 - \alpha)\, \Psi_S(|\nabla u|^2)$ (data term + smoothness term).

  13. Smoothness term from a larger window. Express smoothness using discrete samples: $E_S(u) = \sum_{i=1}^{N} \Psi_S(|\nabla u|_i^2) \approx \sum_{i=1}^{N} \Psi_S\!\big(\sum_{j \in N(i)} |u_i - u_j|^2\big)$ (isotropic), or $E_S(u) \approx \sum_{i=1}^{N} \sum_{j \in N(i)} \Psi_S(|u_i - u_j|^2)$ (anisotropic).

  14. Smoothness term from a larger window (continued). Increase the window size ⇒ the smoothness term becomes $E_S(u) = \sum_{i=1}^{N} \sum_{j \in B(i)} \Psi(|u_i - u_j|^2)\, w(|x_i - x_j|^2)$, which can be minimized by iterating $u_i^{k+1} = \dfrac{\sum_{j \in B(i)} \Psi'(|u_i^k - u_j^k|^2)\, w(|x_i - x_j|^2)\, u_j^k}{\sum_{j \in B(i)} \Psi'(|u_i^k - u_j^k|^2)\, w(|x_i - x_j|^2)}$.

  15. Smoothness term from a larger window (continued). This is the bilateral filter of Tomasi and Manduchi. It is not exactly a gradient descent on $E_S(u)$: the samples $u_i$ are not independent, and each needs a different descent step $\tau$. An alternative functional was proposed by Elad: windowed smoothness + local data term.
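One such bilateral-style iteration step, written out as a sketch (assumptions: Python/NumPy, Gaussian tonal and spatial kernels matching the Leclerc penalizer and soft window used above, and the full signal as neighbourhood; the constant $1/\lambda^2$ in $\Psi_S'$ cancels in the quotient):

```python
import numpy as np

def bilateral_like_step(u, x, lam=0.1, theta=2.0):
    """One iteration
        u_i <- sum_j Psi_S'(|u_i - u_j|^2) w(|x_i - x_j|^2) u_j
             / sum_j Psi_S'(|u_i - u_j|^2) w(|x_i - x_j|^2),
    with Gaussian tonal and spatial kernels."""
    du2 = (u[:, None] - u[None, :])**2            # tonal distances   |u_i - u_j|^2
    dx2 = (x[:, None] - x[None, :])**2            # spatial distances |x_i - x_j|^2
    weights = np.exp(-du2 / lam**2) * np.exp(-dx2 / theta**2)
    return (weights * u[None, :]).sum(axis=1) / weights.sum(axis=1)
```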

  16. The big picture. GLOBAL: M-estimators, $\sum_j \Psi(|u - f_j|^2)$. WINDOWED: local M-smoothers, $\sum_i \sum_j \Psi(|u_i - f_j|^2)\, w(|x_i - x_j|^2)$, and the bilateral filter, $\sum_i \sum_j \Psi_S(|u_i - u_j|^2)\, w(|x_i - x_j|^2)$; intermediate discrete smoothness terms: $\sum_{j \in N(i)} \Psi_S(|u_i - u_j|^2)$ and $\Psi_S\!\big(\sum_{j \in N(i)} |u_i - u_j|^2\big)$. LOCAL: Bayesian / regularization theory, $\alpha\, \Psi_D(|u - f|^2) + (1 - \alpha)\, \Psi_S(|\nabla u|^2)$ (data term + smoothness term).

  17. The big picture. As on the previous slide, but now the WINDOWED data term $\sum_i \sum_j \Psi(|u_i - f_j|^2)\, w(|x_i - x_j|^2)$ and the WINDOWED smoothness term $\sum_i \sum_j \Psi_S(|u_i - u_j|^2)\, w(|x_i - x_j|^2)$ (the bilateral filter) are added, mirroring the data term + smoothness term structure of the Bayesian / regularization functional $\alpha\, \Psi_D(|u - f|^2) + (1 - \alpha)\, \Psi_S(|\nabla u|^2)$ at the LOCAL level.

  18. The unifying functional. $E(u) = \alpha \sum_i \sum_j \Psi_D(|u_i - f_j|^2)\, w_D(|x_i - x_j|^2) + (1 - \alpha) \sum_i \sum_j \Psi_S(|u_i - u_j|^2)\, w_S(|x_i - x_j|^2)$. It covers many nonlinear filters for robust signal estimation and image smoothing; new filter combinations become possible; assumptions about signal and noise translate into choices of $\Psi_D$, $w_D$, $\Psi_S$, $w_S$ and $\alpha$.
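By analogy with the fixed-point updates on slides 8 and 14, one may iterate a weighted average of data samples and current neighbours; the following 1-D sketch is an assumption in that spirit, not a formula taken from the slide (Python/NumPy, Leclerc-type penalizers, Gaussian windows, full-signal neighbourhoods, illustrative parameters):

```python
import numpy as np

def unified_filter(f, x, alpha=0.5, lam_d=0.3, lam_s=0.1,
                   theta_d=2.0, theta_s=2.0, n_iter=20):
    """Fixed-point iteration for
        E(u) = alpha     * sum_{i,j} Psi_D(|u_i - f_j|^2) w_D(|x_i - x_j|^2)
             + (1-alpha) * sum_{i,j} Psi_S(|u_i - u_j|^2) w_S(|x_i - x_j|^2),
    with Leclerc-type penalizers Psi(s^2) = 1 - exp(-s^2/lam^2)
    (Psi'(s^2) = exp(-s^2/lam^2)/lam^2) and Gaussian spatial windows.
    The update averages data samples f_j and current neighbours u_j with the
    corresponding weights -- an assumption by analogy with the earlier updates."""
    dx2 = (x[:, None] - x[None, :])**2
    w_d = np.exp(-dx2 / theta_d**2)
    w_s = np.exp(-dx2 / theta_s**2)
    u = f.astype(float).copy()
    for _ in range(n_iter):
        pd = np.exp(-(u[:, None] - f[None, :])**2 / lam_d**2) / lam_d**2  # Psi_D'
        ps = np.exp(-(u[:, None] - u[None, :])**2 / lam_s**2) / lam_s**2  # Psi_S'
        wd = alpha * pd * w_d                       # data-term weights
        ws = (1 - alpha) * ps * w_s                 # smoothness-term weights
        num = (wd * f[None, :]).sum(axis=1) + (ws * u[None, :]).sum(axis=1)
        u = num / (wd.sum(axis=1) + ws.sum(axis=1))
    return u
```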

