


  1. A homotopy method in regularization of total variation denoising problems. Luis A. Melara, Colorado College. Joint work with: A. J. Kearsley (NIST), R. A. Tapia (Rice University). MCSD, NIST, Gaithersburg, Maryland, USA. Tuesday 30 August 2005.

  2. OUTLINE • Motivation • Previous Work • Problem Formulation • Computational Method • Results • Conclusions

  3. Image Restoration Three categories: • Statistical methods. • Transform-based methods. • Optimization-based methods.

  4. Optimization-based methods • Well suited to images containing sharp edges. • The restored image solves a constrained optimization problem.

  5. Bounded Variational Seminorm. Let x ∈ Ω ⊂ R², and let u : Ω → R. The seminorm is
∫_Ω ‖∇u‖₂ dx = ∫_Ω √( (∂u/∂x)² + (∂u/∂y)² ) dx.
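The seminorm above can be approximated on a pixel grid. The sketch below is illustrative only: the forward-difference stencil and the grid spacing h are choices made for this example, not the discretization used later in the talk.

```python
import numpy as np

def tv_seminorm(u, h=1.0):
    """Approximate int_Omega sqrt((du/dx)^2 + (du/dy)^2) dx
    by forward differences on a uniform grid with spacing h."""
    ux = np.diff(u, axis=1) / h          # d/dx, shape (rows, cols-1)
    uy = np.diff(u, axis=0) / h          # d/dy, shape (rows-1, cols)
    ux, uy = ux[:-1, :], uy[:, :-1]      # trim to a common interior shape
    return np.sum(np.sqrt(ux**2 + uy**2)) * h * h

# A constant image has zero total variation.
flat = np.ones((8, 8))
print(tv_seminorm(flat))  # 0.0
```

A single vertical step edge of height 1 contributes its edge length to the seminorm, which is why this functional preserves sharp edges rather than penalizing them as strongly as a squared-gradient norm would.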

  6. Contaminated image u_obs. The observed image u_obs is
u_obs(x) = u_exact(x) + η(x),
where η ∈ H¹(Ω) is noise and u_exact is the exact image. Define
σ = ( ∫_Ω |u_exact − u_obs|² dx )^{1/2} > 0.
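As a concrete illustration, the noise model and the fidelity level σ can be mimicked numerically. The exact image, the noise amplitude, and the grid below are all invented for this sketch:

```python
import numpy as np

rng = np.random.default_rng(0)
h = 1.0 / 64                             # mesh size (illustrative)
u_exact = np.zeros((64, 64))
u_exact[16:48, 16:48] = 1.0              # a blocky stand-in "exact" image
eta = 0.1 * rng.standard_normal(u_exact.shape)   # stand-in noise field
u_obs = u_exact + eta                    # u_obs = u_exact + eta

# sigma = ( int_Omega |u_exact - u_obs|^2 dx )^{1/2}
sigma = np.sqrt(np.sum((u_exact - u_obs) ** 2) * h * h)
print(sigma > 0)  # True: the observed image is genuinely contaminated
```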

  7. Equality Constrained Optimization Problem (EC). Find u ∈ H¹(Ω) s.t.
min_{u ∈ H¹(Ω)} ∫_Ω ‖∇u‖₂ dx
s.t. (1/2)( ‖u − u_obs‖²_{L²(Ω)} − σ² ) = 0.

  8. Lagrangian Functional. To solve EC, we write the Lagrangian functional
ℓ(u, λ) = ∫_Ω ‖∇u‖₂ dx + (λ/2)( ‖u − u_obs‖²_{L²(Ω)} − σ² ),
where λ ∈ R is the Lagrange multiplier.

  9. Euler-Lagrange Equations. Find u ∈ H¹(Ω) s.t.
M(u, λ) = −∇·( ∇u / ‖∇u‖₂ ) + λ( u − u_obs ) = 0, for x ∈ Ω,
with ∂u/∂n = 0 for x ∈ Γ, where Γ is the boundary of Ω.

  10. Previous Work. Rudin, Osher and Fatemi (1992). Evolve u by the gradient flow
∂u/∂s = ∇·( ∇u / ‖∇u‖₂ ) − λ( u − u_obs ),
where
• s > 0, x ∈ Ω,
• u(x, y, 0) is given, and
• ∂u/∂n = 0 for x ∈ Γ.

  11. At the steady-state solution:
λ = ( 1/(2σ²) ) ∫_Ω ( ‖∇u‖₂ − (∇u_obs · ∇u)/‖∇u‖₂ ) dx.

  12. Regularization. If ∇u = 0, then M(u, λ) is undefined. ⇒ Regularized Lagrangian functional:
ℓ_ε(u, λ) = ∫_Ω √( ‖∇u‖₂² + ε² ) dx + (λ/2)( ‖u − u_obs‖²_{L²(Ω)} − σ² ).
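The effect of the regularization can be seen numerically: with ε > 0 the integrand √( ‖∇u‖₂² + ε² ) stays smooth and strictly positive even where ∇u vanishes. A minimal sketch (the forward-difference stencil and grid are illustrative choices):

```python
import numpy as np

def tv_eps(u, eps, h=1.0):
    """Regularized TV term: int_Omega sqrt(||grad u||_2^2 + eps^2) dx,
    approximated with forward differences on a uniform grid."""
    ux = np.diff(u, axis=1)[:-1, :] / h
    uy = np.diff(u, axis=0)[:, :-1] / h
    return np.sum(np.sqrt(ux**2 + uy**2 + eps**2)) * h * h

flat = np.ones((8, 8))            # grad u = 0 everywhere
print(tv_eps(flat, eps=0.0))      # 0.0: the unregularized integrand vanishes
print(tv_eps(flat, eps=0.1))      # ~4.9 = eps * (49 interior cells): finite, smooth
```

The key point is that the integrand's derivative, ∇u / √( ‖∇u‖₂² + ε² ), is now well defined at ∇u = 0, which is exactly what makes M_ε on the next slide usable.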

  13. Associated Euler-Lagrange Equations. Find u ∈ H¹(Ω) s.t.
M_ε(u, λ) = −∇·( ∇u / √( ‖∇u‖₂² + ε² ) ) + λ( u − u_obs ) = 0, for x ∈ Ω,
with ∂u/∂n = 0 for x ∈ Γ. For ε > 0, M_ε(u, λ) is well-defined.

  14. Related Previous Work. Dobson, Omen and Vogel (1996). Solve
min_{u ∈ H¹(Ω)} (1/2) ‖u − u_obs‖²_{L²(Ω)} + γ ∫_Ω √( ‖∇u‖₂² + ε² ) dx,
where γ ∈ R is a small penalization parameter (Majava, 2001). Newton's method is used.

  15. Hessian of Lagrangian. The Hessian A of the Lagrangian is given by
⟨Am, m⟩ = ∫_Ω { [ −∇·( ∇m / √( ‖∇u‖₂² + ε² ) ) + ∇·( (∇u · ∇m) ∇u / ( √( ‖∇u‖₂² + ε² ) )³ ) ] m + λ m² } dx,
for m ∈ H¹(Ω). Denote J_ε(u; m, m) = ⟨Am, m⟩.

  16. Idea. Newton's method ⇒ Kantorovich Theorem. Show a direct relationship between the regularization parameter ε and the radius of the Kantorovich ball.

  17. Proposition 1. Given m, p ∈ H¹(Ω), the following inequality holds:
||| J_ε(m; ·, ·) − J_ε(p; ·, ·) ||| ≤ c₁(ε) ‖m − p‖_{H¹(Ω)},   (1)
with
c₁(ε) = 3 N(Q) ε²/h · max_{Q_k ⊂ Ω} { 1/D₁², 1/D₂², 1/(D₁ · D₂) },
where D₁ = √( ‖∇m‖₂² + ε² ) and D₂ = √( ‖∇p‖₂² + ε² ).

  18. Proposition 2. Let u ∈ H¹(Ω). If the Jacobian J_ε(u; ·, ·) is symmetric positive definite, then it has a bounded inverse with bound
||| J_ε(u; ·, ·)⁻¹ ||| ≤ c₂(ε) = [ min_{Q_k ⊂ Ω} ( ε²/(D₃)³, λ ) ]⁻¹,
where D₃ = √( ‖∇u‖₂² + ε² ).

  19. Definition. Let
B_p(u^(0), r) = { u : ‖u − u^(0)‖_p < r }
be the p-ball of radius r centered at u^(0).

  20. Theorem (Kantorovich). Consider a function L_ε : Rⁿ → Rⁿ defined on a convex set C ⊆ Rⁿ, with Jacobian operator J_ε. Assume that for some starting point u^(0) ∈ C the following are all satisfied:
1. ||| J_ε(u; ·, ·) − J_ε(v; ·, ·) ||| ≤ α_L ‖u − v‖, ∀ u, v ∈ C (J_ε is Lipschitz with constant α_L),
2. ||| J_ε(u^(0); ·, ·)⁻¹ ||| ≤ α₁,
3. ||| J_ε(u^(0); ·, ·)⁻¹ L_ε(u^(0); ·) ||| ≤ α₂.
If δ = α_L α₁ α₂ ≤ 1/2, and if B_p(u^(0), r) ⊆ C with
r = α₂ ( 1 − √(1 − 2δ) ) / δ,
then the Newton sequence { u^(n) } given by
u^(n+1) = u^(n) − J_ε(u^(n); ·, ·)⁻¹ L_ε(u^(n); ·)
is well-defined, remains in the ball B_p(u^(0), r), and converges to the unique solution of L_ε(u*; ·) = 0 inside B_p(u^(0), r).

  21. Summary.
Large ε ⇒ nice convergence but undesirable solution u*.
Small ε ⇒ bad convergence but desirable solution u*.
From Proposition 1, Proposition 2 and the Kantorovich Theorem:
Large ε₀ ⇒ u^(n)(u^(0)) − u₀* → 0.
Reduce to ε₁ ⇒ u^(n)(u₀*) − u₁* → 0.
Reduce to ε₂ ⇒ u^(n)(u₁*) − u₂* → 0.
…

  22. Homotopy. [Figure: nested Kantorovich balls of radii r(ε₁), r(ε₂), r(ε₃) along the path from u^(0) to u*.]

  23. From Propositions 1 and 2: α_L and α₁ depend on ε ⇒ the radius r of B_p(u, r) depends on ε.

  24. Numerical Implementation. Let
Q̂ = (0, 1) × (0, 1),   Ω = ⋃_{k=1}^{N(Q)} Q_k,
where N(Q) is the number of elements in Ω. Define the affine map F:
F(x̂) = D x̂ + d = x,   x̂ ∈ Q̂ and x ∈ Q_k,
with

  25.
D = [ h 0 ; 0 h ],   d = ( ih, jh )ᵀ,
where i, j = 1, …, N. Here N is the number of partitions along the x- and y-axes, and h = 1/N is the mesh size along the x- and y-axes.
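A small sketch of the element map; the element indices and mesh size below are arbitrary example values:

```python
import numpy as np

# Element map F(x_hat) = D x_hat + d from the reference square
# Q_hat = (0,1)x(0,1) onto the element with lower-left corner (i*h, j*h).
def element_map(i, j, h):
    D = np.array([[h, 0.0], [0.0, h]])
    d = np.array([i * h, j * h])
    return lambda x_hat: D @ np.asarray(x_hat) + d

h = 0.25                         # N = 4 partitions per axis
F = element_map(1, 2, h)
corner_ll = F([0.0, 0.0])        # lower-left corner: (0.25, 0.5)
corner_ur = F([1.0, 1.0])        # upper-right corner: (0.5, 0.75)
print(corner_ll, corner_ur)
```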

  26. Visual Description. [Figure: the map F takes the reference square Q̂ onto the element Q_k.] Here
Q̂ = (0, 1) × (0, 1)   and   Q_k = ( ih, (i+1)h ) × ( jh, (j+1)h ) ⊂ Ω.

  27. Function Composition. Using the affine map F,
u(x) = ( û ∘ F⁻¹ )(x) = û( F⁻¹(x) ).
By the chain rule:
∇u(x) = ∇F⁻¹ · ∇̂ û( F⁻¹(x) ),
with ∇ = ( ∂/∂x, ∂/∂y ) and ∇̂ = ( ∂/∂x̂, ∂/∂ŷ ).

  28. Function Approximation. Approximation of u:
u = Σ_{k=1}^{N(h)} u_k φ_k,
where u_k = u(ih, jh) for some i, j, and N(h) is the number of nodes in Ω. The k-th basis function φ_k ∈ P₁, where P₁ = span{ 1, x, y }.

  29. Objective Function.
∫_Ω √( ‖∇u‖₂² + ε² ) dx = Σ_{k=1}^{N(Q)} ∫_{Q_k} √( ‖∇u‖₂² + ε² ) dx.
Use F to obtain, in each Q_k:
∫_{Q_k} √( ‖∇u‖₂² + ε² ) dx = ∫_{Q̂} √( ‖∇F⁻¹ · ∇̂û‖₂² + ε² ) det( ∇̂F ) dx̂.
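The element-wise change of variables above carries the Jacobian factor det(∇̂F) = h² for this affine map. An illustrative numerical check, with an invented test integrand and arbitrary element indices (the midpoint rule is exact for bilinear integrands, so the comparison is tight):

```python
import numpy as np

# Check: int_{Q_k} g(x) dx = int_{Q_hat} g(F(x_hat)) * h^2 dx_hat
# for F(x_hat) = h * x_hat + (i*h, j*h).
i, j, h = 2, 3, 0.25
g = lambda x, y: x * y                  # invented (bilinear) test integrand

# Midpoint rule on the reference square, including the Jacobian factor h^2.
n = 100
s = (np.arange(n) + 0.5) / n            # midpoints in (0, 1)
Xh, Yh = np.meshgrid(s, s)
mapped = np.sum(g(i * h + h * Xh, j * h + h * Yh)) * h**2 / n**2

# Exact integral of x*y over Q_k = (ih, (i+1)h) x (jh, (j+1)h).
exact = (h**2 * (2 * i + 1) / 2) * (h**2 * (2 * j + 1) / 2)
print(abs(mapped - exact) < 1e-10)  # True
```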

  30. Equality Constraint. Similarly,
(1/2)( ∫_Ω |u − u_obs|² dx − σ² ) = (1/2)( Σ_{k=1}^{N(Q)} ∫_{Q_k} |u − u_obs|² dx − σ² ).
Again, use the mapping F. Idea:
∫_{Q_k} |u|² dx = ∫_{Q̂} |û|² det( ∇̂F ) dx̂.

  31. Optimization Method. Let u ∈ H¹(Ω). Let
f(u) = ∫_Ω √( ‖∇u‖₂² + ε² ) dx
and
g(u) = (1/2)( ∫_Ω |u − u_obs|² dx − σ² ).
Augmented Lagrangian L(u, λ, ρ):
L(u, λ, ρ) = f(u) + λ g(u) + (ρ/2) g(u)²,
where λ ∈ R and ρ ∈ R.

  32. Notation: Discrete Approximations.
• Denote the discrete approximation of f(u) by f(u), u ∈ R^{N(h)}.
• Denote the discrete approximation of g(u) by g(u), u ∈ R^{N(h)}.
• Simplify L(u, λ, ρ) and denote L(u) = L(u, λ, ρ), u ∈ R^{N(h)},
where N(h) is the number of nodes in Ω.

  33. Gradient of L(u). Find u ∈ R^{N(h)} s.t. ∇L(u) = 0, where ∇L(u) ∈ R^{N(h)} with
( ∇L(u) )_i = ( ∇f(u) )_i + λ ( ∇g(u) )_i + ρ g(u) ( ∇g(u) )_i
= ∂f(u)/∂u_i + λ ∂g(u)/∂u_i + ρ g(u) ∂g(u)/∂u_i,
for i = 1, …, N(h).
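This gradient formula can be sanity-checked against finite differences. The f and g below are simple stand-in quadratics invented so the check is self-contained; they are not the discretized TV terms:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 5
u = rng.standard_normal(n)
u_obs = rng.standard_normal(n)
lam, rho, sigma = 0.7, 2.0, 0.3

f = lambda u: np.sum(u**2)                        # stand-in objective term
grad_f = lambda u: 2 * u
g = lambda u: 0.5 * (np.sum((u - u_obs)**2) - sigma**2)
grad_g = lambda u: u - u_obs

L = lambda u: f(u) + lam * g(u) + 0.5 * rho * g(u)**2
grad_L = lambda u: grad_f(u) + lam * grad_g(u) + rho * g(u) * grad_g(u)

# Central finite differences, component by component.
eps = 1e-6
fd = np.array([(L(u + eps * e) - L(u - eps * e)) / (2 * eps)
               for e in np.eye(n)])
print(np.max(np.abs(fd - grad_L(u))) < 1e-5)  # True
```

Note that the quadratic-penalty term (ρ/2) g(u)² differentiates to ρ g(u) ∇g(u), which the check confirms.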

  34. Hessian of L(u). The ij-th component of the Hessian A ∈ R^{N(h) × N(h)} of L(u) is
A_ij = ∂²L(u) / ( ∂u_i ∂u_j ),
where i, j = 1, …, N(h).

  35. Newton's Method. Given u^(0), the Newton sequence { u^(n) } is generated by
u^(n+1) = u^(n) + α s^(n),
where α ∈ R and s^(n) ∈ R^{N(h)} solves
A^(n) s^(n) = −∇L( u^(n) ).

  36. Sufficient Decrease Criteria. Set α := 1. If the inequality
L( u^(n) + α s^(n) ) < L( u^(n) ) + 10⁻⁴ α ∇L( u^(n) ) · s^(n)
is not met, then α := α/2. Otherwise, update
u^(n+1) = u^(n) + α s^(n)
and reset α := 1.
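A minimal sketch of this backtracking rule. The objective, gradient, and step below are toy stand-ins (steepest descent on a quadratic rather than the actual Newton step on the discretized Lagrangian):

```python
import numpy as np

def backtrack(L, grad_L, u, s, c=1e-4):
    """Halve alpha until L(u + alpha*s) < L(u) + c*alpha*grad_L(u).s."""
    alpha = 1.0
    while L(u + alpha * s) >= L(u) + c * alpha * np.dot(grad_L(u), s):
        alpha /= 2.0
    return alpha

L = lambda u: 0.5 * np.dot(u, u)       # toy objective
grad_L = lambda u: u
u = np.array([2.0, -1.0])
s = -grad_L(u)                         # descent-direction stand-in
alpha = backtrack(L, grad_L, u, s)
print(alpha)  # 1.0: the full step already satisfies sufficient decrease
```

With an overlong step (e.g. s scaled by 4), the same loop halves α until the sufficient-decrease inequality holds.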

  37. Lagrange multiplier update. Least-squares update formula:
λ = − ( ∇f(u) · ∇g(u) ) / ( ∇g(u) · ∇g(u) ).
This is the diagonalized multiplier method; convergence properties follow from Tapia (1977).
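Numerically, this update is the λ that minimizes ‖∇f(u) + λ∇g(u)‖₂, i.e. it removes the component of ∇f along ∇g. A small sketch with invented gradient vectors:

```python
import numpy as np

def ls_multiplier(grad_f, grad_g):
    """Least-squares update: lambda = -(grad_f . grad_g)/(grad_g . grad_g)."""
    return -np.dot(grad_f, grad_g) / np.dot(grad_g, grad_g)

grad_f = np.array([1.0, 2.0, 0.0])     # invented example gradients
grad_g = np.array([0.0, 1.0, 0.0])
lam = ls_multiplier(grad_f, grad_g)
print(lam)  # -2.0
```

With this λ, the residual ∇f + λ∇g is orthogonal to ∇g, which is the first-order stationarity condition the update enforces along the constraint gradient.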

  38. Algorithm.
Initialize λ, ρ
do i = 1, ITERS1
   do j = 1, ITERS2
      Solve ∇L(u) = 0 (Newton's method)
      λ⁺ := Least Squares Update
   end do
   ε⁺ := 10⁻ⁱ
   Reinitialize λ⁺
end do
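The loop structure can be sketched as follows. Here solve_newton is a hypothetical stand-in for the inner Newton solve plus multiplier update, and ITERS1 = 3, ITERS2 = 2 are arbitrary example counts; only the ε schedule mirrors the pseudocode:

```python
import numpy as np

def homotopy(u, lam, rho, iters1=3, iters2=2, solve_newton=None):
    """Outer homotopy loop: inner solves, then reduce epsilon by 10x."""
    eps_history = []
    for i in range(1, iters1 + 1):
        for j in range(iters2):
            if solve_newton is not None:
                u, lam = solve_newton(u, lam, rho)  # Newton + LS update
        eps = 10.0 ** (-i)                          # epsilon^+ := 10^{-i}
        eps_history.append(eps)
        lam = 0.0                                   # reinitialize lambda^+
    return u, eps_history

u0 = np.zeros(4)
_, eps_schedule = homotopy(u0, lam=0.0, rho=1.0)
print(eps_schedule)  # [0.1, 0.01, 0.001]
```

The point of the schedule is the homotopy idea from slide 21: each solve at the current ε supplies the starting point that keeps the next, smaller-ε solve inside its Kantorovich ball.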

  39. Numerical Examples. Let Ω = (0, 1) × (0, 1) ⊂ R², with true image u_exact. The component (u_obs)_i is given by
( u_obs )_i = ( u_exact )_i + rand(1) · C,
where C ∼ 10⁻¹ and rand (a MATLAB built-in function) produces uniformly distributed random numbers on (0, 1).
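A sketch of this contamination step. NumPy's uniform generator stands in for MATLAB's rand, and the blocky u_exact is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
C = 0.1                                  # noise scale, C ~ 10^{-1}
u_exact = np.zeros((32, 32))
u_exact[8:24, 8:24] = 1.0                # stand-in "true" image
u_obs = u_exact + rng.uniform(size=u_exact.shape) * C
print(np.all(u_obs >= u_exact))  # True: uniform noise on (0,1) is nonnegative
```

Note this uniform noise is one-sided (it only brightens pixels), unlike zero-mean Gaussian noise; that is how the slide's formula reads.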
