

1. A Parallel Forward-backward Splitting Method for Multiterm Composite Convex Optimization
Maicon Marques Alves. Joint work with Samara C. Lima. Federal University of Santa Catarina, Florianópolis.
DIMACS Workshop on ADMM and Proximal Splitting Methods in Optimization, DIMACS, June 11–13, 2018.

2. Forward-backward splitting
In a real Hilbert space $\mathcal{H}$,
$$\min_{x \in \mathcal{H}} \{ f(x) + \varphi(x) \}$$
where
◮ $f : \mathcal{H} \to \mathbb{R}$ is convex and differentiable with an $L$-Lipschitz continuous gradient:
$$\|\nabla f(x) - \nabla f(y)\| \le L \|x - y\| \qquad \forall x, y \in \mathcal{H}.$$
◮ $\varphi : \mathcal{H} \to \mathbb{R} \cup \{\infty\}$ is proper, convex and closed with an easily computable proximity operator/resolvent:
$$\operatorname{prox}_{\lambda\varphi}(x) := \operatorname*{arg\,min}_{y \in \mathcal{H}} \left\{ \varphi(y) + \frac{1}{2\lambda} \|y - x\|^2 \right\}.$$
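A minimal Python illustration of two standard proximity operators, assuming $\mathcal{H} = \mathbb{R}^n$: the prox of $\tau\|\cdot\|_1$ is componentwise soft-thresholding, and the prox of the indicator $\delta_C$ of a box $C$ is the Euclidean projection onto $C$; the weight $\tau$ and the box bounds below are illustrative placeholders.

    import numpy as np

    def prox_l1(x, t):
        # prox_{t*||.||_1}(x): componentwise soft-thresholding
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def prox_box_indicator(x, lo, hi):
        # prox of the indicator of the box [lo, hi]^n is the projection (clipping)
        return np.clip(x, lo, hi)

    x = np.array([1.5, -0.2, 0.7])
    print(prox_l1(x, 0.5))                 # [1.0, 0.0, 0.2]
    print(prox_box_indicator(x, 0.0, 1.0)) # [1.0, 0.0, 0.7]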

3. Forward-backward splitting
◮ $\varphi = \delta_C$ $\Rightarrow$ $\operatorname{prox}_{\lambda\varphi} = P_C$.
◮ The forward-backward/proximal gradient method:
$$x^+ = \underbrace{\operatorname{prox}_{\lambda\varphi}}_{\text{backward}}\big(\,\underbrace{x - \lambda \nabla f(x)}_{\text{forward}}\,\big), \qquad \lambda > 0.$$
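A minimal Python sketch of this iteration, assuming the concrete instance $f(x) = \tfrac12\|Ax - b\|^2$ and $\varphi = \tau\|\cdot\|_1$ (so the prox is soft-thresholding); the data $A$, $b$, the weight $\tau$ and the step size $\lambda = 1/L$ are illustrative choices.

    import numpy as np

    def prox_l1(x, t):
        # soft-thresholding = prox of t*||.||_1
        return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

    def forward_backward(A, b, tau, n_iter=500):
        # minimizes f(x) + phi(x) with f(x) = 0.5*||Ax - b||^2 and phi = tau*||.||_1
        L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of grad f
        lam = 1.0 / L                            # constant step size lambda
        x = np.zeros(A.shape[1])
        for _ in range(n_iter):
            grad = A.T @ (A @ x - b)             # forward (explicit gradient) step
            x = prox_l1(x - lam * grad, lam * tau)   # backward (proximal) step
        return x

    rng = np.random.default_rng(0)
    A, b = rng.standard_normal((20, 50)), rng.standard_normal(20)
    print(forward_backward(A, b, tau=0.1)[:5])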

4. The problem
In a real Hilbert space $\mathcal{H}$,
$$\min_{x \in \mathcal{H}} \sum_{i=1}^{m} \{ f_i(x) + \varphi_i(x) \}$$
where, for all $i = 1, \ldots, m$,
◮ $f_i : \mathcal{H} \to \mathbb{R}$ is convex and differentiable with an $L$-Lipschitz continuous gradient.
◮ $\varphi_i : \mathcal{H} \to \mathbb{R} \cup \{\infty\}$ is proper, convex and closed with an easily computable proximity operator/resolvent.

5. The problem
◮ Interesting applications (see, e.g., N. He, A. Juditsky and A. Nemirovski, 2015; E. Ryu and W. Yin, 2017).
◮ Under standard regularity conditions, it is equivalent to
$$0 \in \sum_{i=1}^{m} \underbrace{(\nabla f_i + \partial\varphi_i)}_{=:\,T_i}(x).$$

6. The proximal point method
◮ Monotone inclusion problem: find $z \in \mathcal{H}$ such that $0 \in T(z)$, where $T : \mathcal{H} \rightrightarrows \mathcal{H}$ is maximal monotone.
◮ Rockafellar (1976):
$$\| z^k - (\lambda_k T + I)^{-1} z^{k-1} \| \le e_k, \qquad \sum_{k=1}^{\infty} e_k < \infty.$$
◮ Rockafellar (1976): (weak) convergence and applications.
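A minimal Python sketch of these inexact proximal point iterations, assuming the illustrative operator $T(z) = z - c$ (whose exact resolvent is available in closed form) and the summable tolerances $e_k = 1/k^2$; the random perturbation merely simulates an inexact resolvent evaluation within the prescribed error bound.

    import numpy as np

    # Inexact proximal point method with summable errors, for the illustrative
    # maximal monotone operator T(z) = z - c, whose exact resolvent is
    # (lam*T + I)^{-1}(z) = (z + lam*c) / (1 + lam).
    c = np.array([2.0, -1.0])

    def inexact_ppa(z0, lam=1.0, n_iter=200, seed=0):
        rng = np.random.default_rng(seed)
        z = z0.copy()
        for k in range(1, n_iter + 1):
            exact = (z + lam * c) / (1.0 + lam)        # exact resolvent at z^{k-1}
            e_k = 1.0 / k**2                           # summable error bound
            d = rng.standard_normal(z.shape)
            z = exact + e_k * d / np.linalg.norm(d)    # ||z^k - exact|| <= e_k
        return z

    print(inexact_ppa(np.array([10.0, 10.0])))         # approaches c, the zero of T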

7. The hybrid proximal extragradient (HPE) method
◮ $z^+ = (\lambda T + I)^{-1} z \iff v \in T(z^+), \ \lambda v + z^+ - z = 0$.
◮ $\varepsilon$-enlargements (Burachik-Sagastizábal-Svaiter):
$$T^{\varepsilon}(z) := \{ v \in \mathcal{H} \mid \langle z - z', v - v' \rangle \ge -\varepsilon \quad \forall z' \in \mathcal{H}, \ \forall v' \in T(z') \}.$$
◮ $T(z) \subset T^{\varepsilon}(z)$.
◮ $\partial_{\varepsilon} f(z) \subset (\partial f)^{\varepsilon}(z)$.
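A minimal worked check of the last inclusion, assuming the illustrative choice $f = \tfrac12 |\cdot|^2$ on $\mathcal{H} = \mathbb{R}$: there $\partial f(z) = \{z\}$ and $\partial_{\varepsilon} f(z) = [\,z - \sqrt{2\varepsilon},\ z + \sqrt{2\varepsilon}\,]$, and for any $v \in \partial_{\varepsilon} f(z)$, any $z' \in \mathbb{R}$ and $v' = z' \in \partial f(z')$,
$$\langle z - z', v - v' \rangle = (z - z')(v - z') \ \ge\ -\frac{(z - v)^2}{4} \ \ge\ -\frac{\varepsilon}{2} \ \ge\ -\varepsilon,$$
so indeed $v \in (\partial f)^{\varepsilon}(z)$.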

8. The hybrid proximal extragradient (HPE) method
Problem: $0 \in T(z)$.
Hybrid proximal extragradient (HPE) method (Solodov and Svaiter, 1999):
(0) Let $z^0 \in \mathcal{H}$, $\sigma \in [0, 1)$ and set $k = 1$.
(1) Find $(\tilde{z}^k, v^k, \varepsilon_k) \in \mathcal{H} \times \mathcal{H} \times \mathbb{R}_+$ and $\lambda_k > 0$ such that
$$v^k \in T^{\varepsilon_k}(\tilde{z}^k), \qquad \|\lambda_k v^k + \tilde{z}^k - z^{k-1}\|^2 + 2\lambda_k \varepsilon_k \le \sigma^2 \|\tilde{z}^k - z^{k-1}\|^2.$$
(2) Define $z^k = z^{k-1} - \lambda_k v^k$, set $k \leftarrow k + 1$ and go to step (1). End
◮ $\sigma = 0$ $\Rightarrow$ $z^k = (\lambda_k T + I)^{-1} z^{k-1}$.
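A minimal Python sketch of one HPE instance, assuming $T$ is single-valued, monotone and $L$-Lipschitz, and realizing step (1) by a Korpelevich/extragradient-type step (so $\varepsilon_k = 0$ and the relative-error condition holds whenever $\lambda_k \le \sigma/L$); the skew operator $T(z) = Mz$ below is an illustrative placeholder.

    import numpy as np

    M = np.array([[0.0, 1.0], [-1.0, 0.0]])   # skew matrix: T(z) = M z is monotone
    L = np.linalg.norm(M, 2)                  # Lipschitz constant of T

    def T(z):
        return M @ z

    def hpe_extragradient(z0, sigma=0.9, n_iter=300):
        lam = sigma / L
        z = z0.copy()
        for _ in range(n_iter):
            z_tilde = z - lam * T(z)          # forward step producing z_tilde
            v = T(z_tilde)                    # v in T(z_tilde), eps_k = 0
            # step (1) error criterion: ||lam*v + z_tilde - z|| <= sigma*||z_tilde - z||
            assert np.linalg.norm(lam * v + z_tilde - z) <= sigma * np.linalg.norm(z_tilde - z) + 1e-12
            z = z - lam * v                   # step (2)
        return z

    print(hpe_extragradient(np.array([1.0, 2.0])))   # approaches 0, the zero of T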

9. The hybrid proximal extragradient (HPE) method
◮ Some special instances: forward-backward, Tseng's modified forward-backward, Korpelevich, ADMM.
◮ Termination criterion (Monteiro-Svaiter, 2010): given a precision $\rho > 0$, find $(z, v)$ and $\varepsilon \ge 0$ such that
$$v \in T^{\varepsilon}(z), \qquad \max\{\|v\|, \varepsilon\} \le \rho.$$
◮ Monteiro and Svaiter (2010): iteration-complexity; global $\mathcal{O}(1/\sqrt{k})$ pointwise and $\mathcal{O}(1/k)$ ergodic convergence rates.
◮ A., Monteiro and Svaiter (2016): regularized HPE method with $\mathcal{O}(\rho^{-1} \log(\rho^{-1}))$ pointwise iteration-complexity.

10. The hybrid proximal extragradient (HPE) method
Theorem (Monteiro and Svaiter, 2010). (Here $d_0$ denotes the distance of $z^0$ to the solution set of $0 \in T(z)$, $\lambda := \inf_k \lambda_k > 0$, and $(\tilde{z}^a_k, v^a_k, \varepsilon^a_k)$ are the ergodic (averaged) iterates.)
(a) For any $k \ge 1$, there exists $i \in \{1, \ldots, k\}$ such that
$$v^i \in T^{\varepsilon_i}(\tilde{z}^i), \qquad \|v^i\| \le \frac{d_0}{\lambda \sqrt{k}} \sqrt{\frac{1+\sigma}{1-\sigma}}, \qquad \varepsilon_i \le \frac{\sigma^2 d_0^2}{2(1-\sigma^2)\lambda k}.$$
(b) For any $k \ge 1$,
$$v^a_k \in T^{\varepsilon^a_k}(\tilde{z}^a_k), \qquad \|v^a_k\| \le \frac{2 d_0}{\lambda k}, \qquad \varepsilon^a_k \le \frac{2\big(1 + \sigma/\sqrt{1-\sigma^2}\,\big) d_0^2}{\lambda k}.$$

11. The partial inverse method of Spingarn
◮ Spingarn (1983): find $x, y \in \mathcal{H}$ such that
$$x \in V, \qquad y \in V^{\perp}, \qquad y \in T(x),$$
where $V$ is a closed subspace of $\mathcal{H}$ and $T : \mathcal{H} \rightrightarrows \mathcal{H}$ is maximal monotone.
◮ $V = \mathcal{H}$ $\Rightarrow$ $0 \in T(x)$.

12. The partial inverse method of Spingarn
◮ Spingarn (1983):
$$0 \in \sum_{i=1}^{m} T_i(x).$$
◮ It is equivalent to
$$y_i \in T_i(x_i), \qquad \sum_{i=1}^{m} y_i = 0, \qquad x_1 = x_2 = \cdots = x_m.$$
◮ In this case,
$$V = \{ (x_1, x_2, \ldots, x_m) \in \mathcal{H}^m \mid x_1 = x_2 = \cdots = x_m \},$$
$$V^{\perp} = \{ (y_1, y_2, \ldots, y_m) \in \mathcal{H}^m \mid y_1 + y_2 + \cdots + y_m = 0 \},$$
$$T : \mathcal{H}^m \rightrightarrows \mathcal{H}^m, \qquad (x_1, x_2, \ldots, x_m) \mapsto T_1(x_1) \times T_2(x_2) \times \cdots \times T_m(x_m).$$

13. The partial inverse method of Spingarn
Problem: $x \in V$, $y \in V^{\perp}$, $y \in T(x)$.
Partial inverse method:
(0) Let $x^0 \in V$, $y^0 \in V^{\perp}$ and set $k = 1$.
(1) Find $\tilde{x}^k, \tilde{y}^k \in \mathcal{H}$ and $\lambda_k > 0$ such that
$$\frac{1}{\lambda_k} P_V(\tilde{y}^k) + P_{V^{\perp}}(\tilde{y}^k) \in T\Big( P_V(\tilde{x}^k) + \frac{1}{\lambda_k} P_{V^{\perp}}(\tilde{x}^k) \Big), \qquad \tilde{y}^k + \tilde{x}^k = x^{k-1} + y^{k-1}.$$
(2) Define $x^k = P_V(\tilde{x}^k)$, $y^k = P_{V^{\perp}}(\tilde{y}^k)$, set $k \leftarrow k + 1$ and go to step (1). End
◮ $\lambda_k = 1$ $\Rightarrow$ $\tilde{x}^k = (T + I)^{-1}(x^{k-1} + y^{k-1})$.

14. The partial inverse method of Spingarn
◮ Spingarn (1983): the partial inverse of $T$ with respect to $V$ is the operator $T_V : \mathcal{H} \rightrightarrows \mathcal{H}$ defined by
$$v \in T_V(z) \iff P_V(v) + P_{V^{\perp}}(z) \in T\big( P_V(z) + P_{V^{\perp}}(v) \big).$$
◮ In particular,
$$0 \in T_V(z) \iff \underbrace{P_{V^{\perp}}(z)}_{y} \in T\big( \underbrace{P_V(z)}_{x} \big)$$
and
$$\underbrace{x^k + y^k}_{z^k} = (\lambda_k T_V + I)^{-1} \big( \underbrace{x^{k-1} + y^{k-1}}_{z^{k-1}} \big).$$
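Two simple special cases may help fix the definition (an illustrative check): if $V = \mathcal{H}$, then $P_{V^{\perp}} = 0$ and $v \in T_V(z) \iff v \in T(z)$, so $T_V = T$; if $V = \{0\}$, then $P_V = 0$ and $v \in T_V(z) \iff z \in T(v)$, so $T_V = T^{-1}$, whence the name "partial inverse".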

15. Spingarn's operator splitting
Problem: $0 \in \sum_{i=1}^{m} T_i(x)$.
Spingarn's operator splitting:
(0) Let $(x^0, y_{1,0}, \ldots, y_{m,0}) \in \mathcal{H}^{m+1}$ be such that $y_{1,0} + \cdots + y_{m,0} = 0$ and set $k = 1$.
(1) For each $i = 1, \ldots, m$, find $\tilde{x}_{i,k}, \tilde{y}_{i,k} \in \mathcal{H}$ such that
$$\tilde{y}_{i,k} \in T_i(\tilde{x}_{i,k}), \qquad \tilde{y}_{i,k} + \tilde{x}_{i,k} = x^{k-1} + y_{i,k-1}.$$
(2) Define
$$x^k = \frac{1}{m} \sum_{i=1}^{m} \tilde{x}_{i,k}, \qquad y_{i,k} = y_{i,k-1} + x^k - \tilde{x}_{i,k} \quad \text{for } i = 1, \ldots, m,$$
set $k \leftarrow k + 1$ and go to step (1).
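A minimal Python sketch of this scheme, assuming the illustrative choice $T_i(x) = x - c_i$ (so step (1) has the closed-form solution $\tilde{x}_{i,k} = (x^{k-1} + y_{i,k-1} + c_i)/2$); the points $c_i$ are placeholders, and the zero of $\sum_i T_i$ is their average. Step (1) is independent across $i$, hence parallelizable.

    import numpy as np

    # Illustrative operators T_i(x) = x - c_i, i.e. gradients of 0.5*||x - c_i||^2,
    # whose resolvent is (T_i + I)^{-1}(u) = (u + c_i)/2.
    c = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([3.0, 3.0])]
    m = len(c)

    def spingarn_splitting(n_iter=200):
        x = np.zeros(2)
        y = [np.zeros(2) for _ in range(m)]            # y_1 + ... + y_m = 0
        for _ in range(n_iter):
            # Step (1): independent (parallelizable) resolvent evaluations
            x_tilde = [(x + y[i] + c[i]) / 2.0 for i in range(m)]
            # Step (2): averaging and update of the y_i
            x = sum(x_tilde) / m
            y = [y[i] + x - x_tilde[i] for i in range(m)]
        return x

    print(spingarn_splitting())   # approaches mean(c), the zero of sum_i T_i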

16. SPDG (Mahey, Oualibouch and Tao, 1995)
Problem: $x \in V$, $y \in V^{\perp}$, $y \in T(x)$.
(0) Let $x^0 \in V$, $y^0 \in V^{\perp}$, $\gamma > 0$ be given and set $k = 1$.
(1) Find $\tilde{x}^k, \tilde{y}^k \in \mathcal{H}$ such that
$$\tilde{y}^k \in T(\tilde{x}^k), \qquad \gamma \tilde{y}^k + \tilde{x}^k = x^{k-1} + \gamma y^{k-1}.$$
(2) Define $x^k = P_V(\tilde{x}^k)$, $y^k = P_{V^{\perp}}(\tilde{y}^k)$, set $k \leftarrow k + 1$ and go to step (1). End
◮ Fixed point theory: $(x^k, \gamma y^k) = J\big((x^{k-1}, \gamma y^{k-1})\big)$, with $J$ firmly nonexpansive.

17. SPDG (Mahey, Oualibouch and Tao, 1995)
◮ A. and Lima (2017):
$$x^k + \gamma y^k = \big( (\gamma T)_V + I \big)^{-1} (x^{k-1} + \gamma y^{k-1}),$$
and $T$ strongly monotone and Lipschitz $\Rightarrow$ $T_V$ strongly monotone.
◮ Rockafellar (2017): progressive decoupling algorithm for monotone variational inequalities.

18. Relative-error inexact Spingarn's operator splitting (A. and Lima, 2017)
Problem: $0 \in T_1(x) + T_2(x) + \cdots + T_m(x)$.
(0) Let $(x^0, y_{1,0}, \ldots, y_{m,0}) \in \mathcal{H}^{m+1}$ be such that $y_{1,0} + \cdots + y_{m,0} = 0$, let $\sigma \in [0, 1)$ and $\gamma > 0$ be given, and set $k = 1$.
(1) For each $i = 1, \ldots, m$, find $\tilde{x}_{i,k}, \tilde{y}_{i,k} \in \mathcal{H}$ and $\varepsilon_{i,k} \ge 0$ such that
$$\tilde{y}_{i,k} \in T_i^{\varepsilon_{i,k}}(\tilde{x}_{i,k}), \qquad \gamma \tilde{y}_{i,k} + \tilde{x}_{i,k} = x^{k-1} + \gamma y_{i,k-1}, \qquad 2\gamma \varepsilon_{i,k} \le \sigma^2 \|\tilde{x}_{i,k} - x^{k-1}\|^2.$$
(2) Define
$$x^k = \frac{1}{m} \sum_{i=1}^{m} \tilde{x}_{i,k}, \qquad y_{i,k} = y_{i,k-1} + \frac{1}{\gamma}(x^k - \tilde{x}_{i,k}) \quad \text{for } i = 1, \ldots, m,$$
set $k \leftarrow k + 1$ and go to step (1).
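A minimal Python sketch of the relative-error mechanism in step (1), assuming the illustrative choice $T_i = \nabla \varphi_i$ with $\varphi_i(x) = \tfrac12\|x - c_i\|^2$: the prox-type subproblem is solved inexactly by inner gradient steps, and for this $\varphi_i$ one may take $\varepsilon_{i,k} = \tfrac12\|\tilde{x}_{i,k} - c_i - \tilde{y}_{i,k}\|^2$, which guarantees $\tilde{y}_{i,k} \in \partial_{\varepsilon_{i,k}} \varphi_i(\tilde{x}_{i,k}) \subset T_i^{\varepsilon_{i,k}}(\tilde{x}_{i,k})$. The points $c_i$, the parameters $\gamma$, $\sigma$ and the inner solver are all illustrative.

    import numpy as np

    c = [np.array([1.0, 0.0]), np.array([0.0, 2.0]), np.array([3.0, 3.0])]
    m, gamma, sigma = len(c), 1.0, 0.5

    def inexact_resolvent(x_prev, y_prev, ci, max_inner=50, tol=1e-12):
        # Approximately minimize phi_i(w) + ||w - u||^2/(2*gamma), u = x_prev + gamma*y_prev,
        # by gradient steps, stopping once 2*gamma*eps <= sigma^2*||x_tilde - x_prev||^2.
        u = x_prev + gamma * y_prev
        w = x_prev.copy()
        step = 0.5 * gamma / (1.0 + gamma)   # deliberately small so the loop is genuinely inexact
        for _ in range(max_inner):
            w = w - step * ((w - ci) + (w - u) / gamma)
            y_tilde = (u - w) / gamma        # enforces gamma*y_tilde + x_tilde = u exactly
            eps = 0.5 * np.linalg.norm(w - ci - y_tilde) ** 2
            if 2 * gamma * eps <= sigma**2 * np.linalg.norm(w - x_prev)**2 + tol:
                break
        return w, y_tilde

    def inexact_spingarn(n_iter=100):
        x = np.zeros(2)
        y = [np.zeros(2) for _ in range(m)]                           # sum_i y_i = 0
        for _ in range(n_iter):
            tilde = [inexact_resolvent(x, y[i], c[i]) for i in range(m)]  # parallel over i
            x_new = sum(xt for xt, _ in tilde) / m                        # step (2): averaging
            y = [y[i] + (x_new - tilde[i][0]) / gamma for i in range(m)]
            x = x_new
        return x

    print(inexact_spingarn())   # approaches mean(c), the solution of 0 in sum_i T_i(x)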
