Deep Unfolding of a Proximal Interior Point Method for Image Restoration


  1. Title

Outline: Introduction · Proximal IP method · Proximity operator of the barrier · Proposed architecture · Network stability · Numerical experiments · Conclusion

Deep Unfolding of a Proximal Interior Point Method for Image Restoration
M.-C. Corbineau¹, in collaboration with C. Bertocchi², E. Chouzenoux¹, J.C. Pesquet¹, M. Prato²
¹ Université Paris-Saclay, CentraleSupélec, Inria, Centre de Vision Numérique, Gif-sur-Yvette, France
² Università di Modena e Reggio Emilia, Modena, Italy
19 November 2019, Workshop on Regularisation for Inverse Problems and Machine Learning, Jussieu, Paris

Corbineau et al., Deep Unfolding of a Proximal Interior Point Method, Workshop Jussieu, 2019 (slide 1 / 35)


  5. Motivation

Inverse problem in imaging: y = D(Hx), where y ∈ R^m is the observed image, D the degradation model, H ∈ R^{m×n} the linear observation model, and x ∈ R^n the original image.

Variational methods: minimize f(Hx, y) + λR(x) over x ∈ C, where f : R^m × R^m → R is the data-fitting term, R : R^n → R the regularization function, and λ > 0 the regularization weight.
✓ Incorporate prior knowledge about the solution and enforce desirable constraints
✗ No closed-form solution → advanced algorithms
✗ Estimation of λ and tuning of the algorithm parameters → time-consuming

Deep-learning methods:
✓ Generic and very efficient architectures
✗ Pre-processing step: solve an optimization problem → estimate the regularization parameter
✗ Black box, no theoretical guarantees

→ Combine the benefits of both approaches: unfold a proximal interior point algorithm.
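As a point of reference for the variational formulation above, here is a minimal sketch (a toy instance of our own, not from the talk) in which f is a least-squares data term, R a Tikhonov penalty, and C = R^n, the one convenient case where the minimizer has a closed form; the regularizers of practical interest generally do not, which is exactly the slide's point about needing advanced algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy instance of the variational formulation: f(Hx, y) = 0.5*||Hx - y||^2,
# R(x) = 0.5*||x||^2 (Tikhonov), C = R^n. The minimizer has the closed form
# x* = (H^T H + lam*I)^{-1} H^T y.
m, n = 20, 10
H = rng.standard_normal((m, n))
x_true = rng.standard_normal(n)
y = H @ x_true + 0.01 * rng.standard_normal(m)

lam = 0.1
x_hat = np.linalg.solve(H.T @ H + lam * np.eye(n), H.T @ y)

# Relative reconstruction error; small for mild noise and mild regularization.
print(np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```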

  6. Deep Unfolding

Examples:
• Sparse coding: FISTA [Gregor and LeCun, 2010], ISTA [Kamilov and Mansour, 2016]
• Compressive sensing: ISTA [Zhang and Ghanem, 2018], ADMM [Sun et al., 2016]

Principle:
Iterative solver: for k = 0, 1, ..., x_{k+1} = A(x_k, θ_k), where θ_k are hyperparameters; estimate x* = lim_{k→∞} x_k.
Unfolded algorithm: for k = 0, 1, ..., K−1, x_{k+1} = A_k(x_k, L_k(θ)), where L_k is a layer estimating the hyperparameters; estimate x* = x_K.

• Operators and functions included in A can be learned
✓ Gradient backpropagation and training are simpler
✗ The link to the original algorithm is weakened
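The unfolding principle can be sketched as follows, with plain gradient descent on a least-squares term standing in for the update A, and fixed per-layer step sizes standing in for the learned hyperparameters (all choices here are ours, for illustration only):

```python
import numpy as np

# Schematic unfolding: the update A(x, theta) is plain gradient descent on
# 0.5*||Hx - y||^2, and theta_k is a per-layer step size. In a real unfolded
# network the step sizes (and other operators) would be learned by
# backpropagation; here they are fixed, to illustrate the forward pass only.
def unfolded_forward(H, y, step_sizes):
    x = np.zeros(H.shape[1])
    for gamma in step_sizes:          # one loop pass = one network layer
        grad = H.T @ (H @ x - y)      # gradient of the data-fitting term
        x = x - gamma * grad          # layer k: x_{k+1} = A_k(x_k)
    return x                          # estimate x* = x_K (finite depth K)

H = np.array([[1.0, 0.0], [0.0, 2.0]])
y = np.array([1.0, 2.0])
x_K = unfolded_forward(H, y, step_sizes=[0.2] * 30)
print(x_K)  # approaches the least-squares solution [1, 1]
```

Note the contrast with the iterative solver: the network has a fixed depth K and returns x_K, rather than running until convergence.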

  7. Notation and Assumptions

Proximity operator. Let Γ_0(R^n) be the set of proper lsc convex functions from R^n to R ∪ {+∞}. The proximity operator [http://proximity-operator.net/] of g ∈ Γ_0(R^n) at x ∈ R^n is uniquely defined as

    prox_g(x) = argmin_{z ∈ R^n} g(z) + (1/2) ‖z − x‖².

Assumptions for P_0: minimize f(Hx, y) + λR(x) over x ∈ C.
We assume that f(·, y) and R are twice differentiable, that f(H·, y) + λR ∈ Γ_0(R^n), and that this function is either coercive or C is bounded. The feasible set is C = {x ∈ R^n | c_i(x) ≥ 0 for all i ∈ {1, ..., p}}, where −c_i ∈ Γ_0(R^n) for every i, and the strict interior of C is nonempty.
⇒ Existence of a solution to P_0.
Twice-differentiability ⇒ training using gradient descent.

Logarithmic barrier B: for all x ∈ R^n,
    B(x) = −Σ_{i=1}^p ln(c_i(x)) if x ∈ int C, and +∞ otherwise.
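As a sanity check on the definition, the snippet below compares a textbook closed-form proximity operator (soft-thresholding, for g = α|·|; a standard result, not from the slides) with a brute-force minimization of the defining objective:

```python
import numpy as np

# The proximity operator of g(z) = alpha*|z| is soft-thresholding, a standard
# closed form: prox_g(x) = sign(x) * max(|x| - alpha, 0).
def prox_abs(x, alpha):
    return np.sign(x) * np.maximum(np.abs(x) - alpha, 0.0)

# Brute-force check of the defining minimization
#   prox_g(x) = argmin_z g(z) + 0.5*(z - x)^2
x, alpha = 1.3, 0.5
z = np.linspace(-3.0, 3.0, 200001)
objective = alpha * np.abs(z) + 0.5 * (z - x) ** 2
z_star = z[np.argmin(objective)]

print(prox_abs(x, alpha), z_star)  # both close to 0.8
```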


  9. Logarithmic barrier method

Constrained problem P_0: minimize f(Hx, y) + λR(x) over x ∈ C
⇓
Unconstrained subproblem P_μ: minimize f(Hx, y) + λR(x) + μB(x) over x ∈ R^n, where μ > 0 is the barrier parameter.

P_0 is replaced by a sequence of subproblems (P_{μ_j})_{j ∈ N}, solved approximately for a sequence μ_j → 0.
Main advantages: feasible iterates, superlinear convergence for NLP
✗ Inversion of an n × n matrix at each step
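A minimal numerical illustration of this path-following idea, on a 1D toy problem of our own (each subproblem is solved by bisection on its derivative rather than by a Newton step, so no matrix inversion arises here):

```python
# Constrained toy problem (ours, for illustration): minimize (x - 2)^2
# subject to x <= 1. The solution is x = 1, on the boundary. The barrier
# subproblem P_mu minimizes (x - 2)^2 - mu*ln(1 - x) over x < 1.
def solve_subproblem(mu, lo=-10.0, hi=1.0 - 1e-12):
    # The subproblem is strictly convex; bisect on its derivative
    # g'(x) = 2*(x - 2) + mu/(1 - x).
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if 2.0 * (mid - 2.0) + mu / (1.0 - mid) < 0.0:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Path-following: the approximate minimizers stay strictly feasible and
# approach the constrained solution from the interior as mu -> 0.
for mu in [1.0, 1e-2, 1e-4, 1e-6]:
    x_mu = solve_subproblem(mu)
    print(mu, x_mu)  # x_mu < 1 always; x_mu -> 1 as mu -> 0
```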


  11. Proximal interior point strategy

→ Combine the interior point method with the proximity operator.

Exact version of the proximal IPM in [Kaplan and Tichatschke, 1998]:
Let x_0 ∈ int C and γ > 0, with γ_k ≥ γ for all k ∈ N and μ_k → 0;
for k = 0, 1, ... do
    x_{k+1} = prox_{γ_k (f(H·, y) + λR + μ_k B)}(x_k)
end for
✗ No closed-form solution for prox_{γ_k (f(H·, y) + λR + μ_k B)}

Proposed forward–backward proximal IPM:
Let x_0 ∈ int C and γ > 0, with γ_k ≥ γ for all k ∈ N and μ_k → 0;
for k = 0, 1, ... do
    x_{k+1} = prox_{γ_k μ_k B}( x_k − γ_k ( H^⊤ ∇_1 f(Hx_k, y) + λ∇R(x_k) ) )
end for
✓ Only requires prox_{γ_k μ_k B}
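A 1D sketch of the proposed forward-backward scheme on a toy problem of our own (minimize 0.5(x + 1)² subject to x ≥ 0, whose solution sits on the boundary at x = 0); for the single constraint c(x) = x the barrier prox has a simple closed form, so each iteration is fully explicit:

```python
import math

# With c(x) = x, the barrier is B(x) = -ln(x) on (0, +inf), and
# prox_{alpha*B}(u) solves z - u = alpha/z, a quadratic whose positive root is
# z = (u + sqrt(u^2 + 4*alpha)) / 2.
def prox_log_barrier(u, alpha):
    return 0.5 * (u + math.sqrt(u * u + 4.0 * alpha))

x = 1.0            # x_0 in the strict interior of [0, +inf)
gamma = 0.5        # constant step size, below 1/L with L = f'' = 1
mu = 1.0
for k in range(200):
    grad = x + 1.0                                      # f'(x), forward step
    x = prox_log_barrier(x - gamma * grad, gamma * mu)  # backward (prox) step
    assert x > 0.0                                      # iterates stay feasible
    mu *= 0.9                                           # drive mu_k -> 0

print(x)  # close to the constrained minimizer 0
```

Note that the prox maps even an infeasible forward point x − γ f'(x) back into the strict interior, which is what keeps every iterate feasible.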

  12. Proximity operator of the barrier

Let ϕ : (x, α) ↦ prox_{αB}(x). A neural network obtained by unfolding an iterative solver A requires computing A(x, θ).
→ What is the expression of the proximity operator ϕ(x, α)?
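The slide leaves this question open. As intuition, in the simple 1D case C = [0, +∞) (a special case we work out ourselves, not the general answer), ϕ reduces to the positive root of a quadratic; the check below verifies that closed form against a brute-force minimization of the prox objective:

```python
import math
import numpy as np

# For the single constraint c(x) = x, B(x) = -ln(x) on (0, +inf), and the
# optimality condition of the prox problem, z - x - alpha/z = 0, is a
# quadratic in z with exactly one positive root:
def phi(x, alpha):
    return 0.5 * (x + math.sqrt(x * x + 4.0 * alpha))

# Brute-force check against the defining minimization
#   phi(x, alpha) = argmin_{z > 0} -alpha*ln(z) + 0.5*(z - x)^2
x, alpha = -0.3, 0.7
z = np.linspace(1e-6, 5.0, 500001)
z_star = z[np.argmin(-alpha * np.log(z) + 0.5 * (z - x) ** 2)]

print(phi(x, alpha), z_star)  # both about 0.70
```

The input x here is negative, yet ϕ(x, α) is strictly positive: the barrier prox always lands in the interior of the constraint set.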
