
ICML 2019 in Long Beach: Model Function Based Conditional Gradient Method with Armijo-like Line Search (PowerPoint PPT Presentation)



  1. ICML 2019 in Long Beach
     Model Function Based Conditional Gradient Method with Armijo-like Line Search
     Peter Ochs, Mathematical Optimization Group, Saarland University
     13.06.2019, joint work with Yura Malitsky

  2. Classic Conditional Gradient Method

     Constrained smooth optimization problem:
     \[ \min_{x \in C} f(x) \]
     ◮ $C \subset \mathbb{R}^N$ compact and convex constraint set

     Conditional gradient method, update step:
     \[ y^{(k)} \in \operatorname*{argmin}_{y \in C} \, \langle \nabla f(x^{(k)}), y \rangle, \qquad x^{(k+1)} = \gamma_k y^{(k)} + (1 - \gamma_k) x^{(k)} \]

     Convergence mainly relies on:
     ◮ the step size $\gamma_k \in [0, 1]$ (we consider Armijo line search)
     ◮ the Descent Lemma (implies a curvature condition)
     A minimal sketch of one iteration with backtracking follows below.
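A minimal sketch of the classic conditional gradient (Frank-Wolfe) method with an Armijo-like backtracking line search. The problem instance (least squares over the unit simplex) and all parameter values (`rho`, `beta`, `iters`) are illustrative assumptions, not taken from the talk.

```python
import numpy as np

def f(x, A, b):
    """Smooth objective f(x) = 0.5 * ||Ax - b||^2."""
    r = A @ x - b
    return 0.5 * r @ r

def grad_f(x, A, b):
    return A.T @ (A @ x - b)

def lmo_simplex(g):
    """Linear minimization oracle over the unit simplex:
    argmin_{y in simplex} <g, y> is the vertex e_i with i = argmin_i g_i."""
    y = np.zeros_like(g)
    y[np.argmin(g)] = 1.0
    return y

def conditional_gradient(A, b, iters=100, rho=1e-4, beta=0.5):
    n = A.shape[1]
    x = np.full(n, 1.0 / n)          # feasible start: simplex barycenter
    for _ in range(iters):
        g = grad_f(x, A, b)
        y = lmo_simplex(g)           # y^(k) in argmin_{y in C} <grad f(x^(k)), y>
        d = y - x                    # feasible direction
        slope = g @ d                # <grad f(x), d> <= 0 by optimality of y
        if slope > -1e-12:           # stationarity measure nearly zero: stop
            break
        gamma = 1.0
        # Armijo-like backtracking: shrink gamma until sufficient decrease holds
        while f(x + gamma * d, A, b) > f(x, A, b) + rho * gamma * slope:
            gamma *= beta
        x = x + gamma * d            # convex combination stays in C
    return x

rng = np.random.default_rng(0)
A, b = rng.standard_normal((30, 10)), rng.standard_normal(30)
x = conditional_gradient(A, b)
print(f(x, A, b), x.sum())           # final objective; coordinates still sum to 1
```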

  3. Generalizing the Descent Lemma

     Descent Lemma:
     \[ \bigl| f(x) - f(\bar x) - \langle \nabla f(\bar x), x - \bar x \rangle \bigr| \le \frac{L}{2} \|x - \bar x\|^2 \]
     provides a measure for the linearization error, with quadratic growth.
     ◮ $f$ smooth, non-convex
     ◮ $L$ is the Lipschitz constant of $\nabla f$
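For completeness (a standard derivation, not spelled out on the slide): the Descent Lemma follows from the fundamental theorem of calculus,
\[ f(x) - f(\bar x) - \langle \nabla f(\bar x), x - \bar x \rangle = \int_0^1 \bigl\langle \nabla f\bigl(\bar x + t(x - \bar x)\bigr) - \nabla f(\bar x), \, x - \bar x \bigr\rangle \, dt, \]
since Cauchy-Schwarz and the Lipschitz bound $\|\nabla f(\bar x + t(x - \bar x)) - \nabla f(\bar x)\| \le L t \|x - \bar x\|$ bound the absolute value of the integrand by $L t \|x - \bar x\|^2$, whose integral over $[0,1]$ is $\frac{L}{2} \|x - \bar x\|^2$.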

  4. Generalizing the Descent Lemma

     Generalization of the Descent Lemma:
     \[ \bigl| f(x) - f(\bar x) - \langle \nabla f(\bar x), x - \bar x \rangle \bigr| \le \omega(\|x - \bar x\|) \]
     provides a measure for the linearization error, with growth given by $\omega$.
     ◮ $f$ smooth, non-convex
     ◮ $\omega \colon \mathbb{R}_+ \to \mathbb{R}_+$ is a growth function
     The choice $\omega(t) = \frac{L}{2} t^2$ recovers the classical Descent Lemma.

  5. Generalizing the Descent Lemma

     Replacing the linearization $f(\bar x) + \langle \nabla f(\bar x), x - \bar x \rangle$ by a general model function $f_{\bar x}$ drops the smoothness requirement on $f$:
     \[ \bigl| f(x) - f_{\bar x}(x) \bigr| \le \omega(\|x - \bar x\|) \]
     provides a measure for the approximation error, with growth given by $\omega$.
     ◮ $f$ non-smooth, non-convex
     ◮ $\omega \colon \mathbb{R}_+ \to \mathbb{R}_+$ is a growth function

  6. Model Assumption

     \[ \bigl| f(x) - f_{\bar x}(x) \bigr| \le \omega(\|x - \bar x\|) \]

     [Figure: the model function $f_{\bar x}$ lies within the tube between $f(x) - \omega(\|x - \bar x\|)$ and $f(x) + \omega(\|x - \bar x\|)$, touching $f$ at $\bar x$.]

  7. Model Function based Conditional Gradient Method

     Update step:
     \[ y^{(k)} \in \operatorname*{argmin}_{y \in C} \, f_{x^{(k)}}(y), \qquad x^{(k+1)} = \gamma_k y^{(k)} + (1 - \gamma_k) x^{(k)} \]

     Example for the model assumption $| f(x) - f_{\bar x}(x) | \le \omega(\|x - \bar x\|)$:
     ◮ additive composite problem: $\min_{x \in C} \{ f(x) = g(x) + h(x) \}$, with $g$ non-smooth and $h$ smooth
     ◮ model function: $f_{\bar x}(x) = g(x) + h(\bar x) + \langle \nabla h(\bar x), x - \bar x \rangle$
     ◮ oracle: $\operatorname*{argmin}_{y \in C} \, g(y) + \langle \nabla h(x^{(k)}), y \rangle$ (a closed-form instance is sketched below)
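A sketch of the model-function oracle for the additive composite case, $\operatorname{argmin}_{y \in C} g(y) + \langle \nabla h(x^{(k)}), y \rangle$, under the illustrative assumptions $g(y) = \|y\|_1$ and $C = [-1, 1]^n$ (a box); these choices are ours, not from the slides. With them the oracle separates over coordinates and has the closed form below; other choices of $g$ and $C$ would need a different subsolver.

```python
import numpy as np

def composite_oracle(grad_h_xk):
    """Coordinatewise argmin of |y_i| + c_i * y_i over y_i in [-1, 1],
    where c = grad h(x^(k))."""
    c = np.asarray(grad_h_xk, dtype=float)
    y = np.zeros_like(c)
    y[c > 1.0] = -1.0    # moving to -1 wins: objective value 1 - c_i < 0
    y[c < -1.0] = 1.0    # moving to +1 wins: objective value 1 + c_i < 0
    return y             # elsewhere y_i = 0 is optimal

print(composite_oracle([2.0, -3.0, 0.5]))  # -> [-1.  1.  0.]
```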

  8. Examples

     Example for the model assumption $| f(x) - f_{\bar x}(x) | \le \omega(\|x - \bar x\|)$:
     ◮ hybrid Proximal–Conditional Gradient, example:
     \[ \min_{x_1 \in C_1, \, x_2 \in C_2} \{ f(x_1, x_2) = g(x_1) + h(x_2) \}, \]
     with $g$ non-smooth and $h$ smooth
     ◮ model function: $f_{\bar x}(x_1, x_2) = g(x_1) + \frac{1}{2\lambda} \|x_1 - \bar x_1\|^2 + h(\bar x_2) + \langle \nabla h(\bar x_2), x_2 - \bar x_2 \rangle$
     ◮ oracle (see the sketch below):
     \[ y_1^{(k)} \in \operatorname*{argmin}_{y_1 \in C_1} \, g(y_1) + \frac{1}{2\lambda} \|y_1 - x_1^{(k)}\|^2, \qquad y_2^{(k)} \in \operatorname*{argmin}_{y_2 \in C_2} \, \langle \nabla h(x_2^{(k)}), y_2 \rangle \]
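A sketch of the two-block hybrid oracle under illustrative assumptions of our own: $g = \|\cdot\|_1$ with $C_1 = \mathbb{R}^{n_1}$, so the $x_1$-block is a soft-thresholding proximal step, and $C_2$ an $\ell_2$-ball of radius $r$, so the $x_2$-block is a standard linear minimization step. These concrete choices just make both subproblems closed-form.

```python
import numpy as np

def prox_l1(x1_k, lam):
    """argmin_{y1} ||y1||_1 + (1/(2*lam)) * ||y1 - x1_k||^2  (soft-thresholding)."""
    return np.sign(x1_k) * np.maximum(np.abs(x1_k) - lam, 0.0)

def lmo_l2_ball(grad_h_x2k, r):
    """argmin_{||y2|| <= r} <grad h(x2_k), y2>  =  -r * g / ||g||."""
    g = np.asarray(grad_h_x2k, dtype=float)
    norm = np.linalg.norm(g)
    return -r * g / norm if norm > 0 else np.zeros_like(g)

y1 = prox_l1(np.array([1.5, -0.2, 0.7]), lam=0.5)
y2 = lmo_l2_ball(np.array([3.0, 4.0]), r=1.0)
print(y1, y2)   # -> [ 1.  -0.   0.2] [-0.6 -0.8]
```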

  9. Examples

     ◮ composite problem
     ◮ second order Conditional Gradient

     Design model functions for your problem!
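As a hedged illustration of the second-order variant (our reading; the slide only names it): for twice differentiable $f$ one can take the quadratic model
\[ f_{\bar x}(x) = f(\bar x) + \langle \nabla f(\bar x), x - \bar x \rangle + \tfrac{1}{2} \langle \nabla^2 f(\bar x)(x - \bar x), x - \bar x \rangle, \]
so the subproblem $\min_{y \in C} f_{x^{(k)}}(y)$ becomes a quadratic program over $C$; if $\nabla^2 f$ is $M$-Lipschitz, Taylor's theorem gives the model assumption with the cubic growth function $\omega(t) = \frac{M}{6} t^3$.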
