A Secant Method for Nonsmooth Optimization


  1. A Secant Method for Nonsmooth Optimization
     Asef Nazari, CSIRO Melbourne
     CARMA Workshop on Optimization, Nonlinear Analysis, Randomness and Risk
     Newcastle, Australia, 12 July 2014

  2. Outline
     1. Background
        - The Problem
        - Optimization Algorithms and Components
        - Subgradient and Subdifferential
     2. Secant Method
        - Definitions
        - Optimality Condition and Descent Direction
        - The Secant Algorithm
        - Numerical Results

  3. The Problem
     Unconstrained optimization problem:
        min_{x ∈ R^n} f(x)
     where f : R^n → R is locally Lipschitz.

  4. Why just unconstrained?

  5. Transforming Constrained into Unconstrained
     Constrained optimization problem:
        min_{x ∈ Y} f(x), where Y ⊂ R^n.

  6. Distance function:
        dist(x, Y) = min_{y ∈ Y} ‖y − x‖.

  7. Theory of penalty functions (under some conditions): replace the constrained problem with
        min_{x ∈ R^n} f(x) + σ dist(x, Y).
     The penalized objective f(x) + σ dist(x, Y) is a nonsmooth function, as the sketch below illustrates.
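A minimal Python sketch of the exact-penalty idea, with an illustrative f and Y = [−1, 1] chosen here (not from the talk): the penalized objective is nonsmooth at the boundary of Y even though f itself is smooth.

```python
import numpy as np

# Illustrative constrained problem (not from the talk):
#   min (x - 2)^2  subject to  x in Y = [-1, 1].

def f(x):
    return (x - 2.0) ** 2

def dist(x, a=-1.0, b=1.0):
    # Distance from x to the interval Y = [a, b].
    return max(a - x, 0.0, x - b)

def penalized(x, sigma=10.0):
    # Exact-penalty objective f(x) + sigma * dist(x, Y):
    # nonsmooth at the boundary of Y even though f is smooth.
    return f(x) + sigma * dist(x)

# Crude grid search just to locate the minimizer of the penalized problem.
xs = np.linspace(-3.0, 3.0, 60001)
x_star = min(xs, key=penalized)
print(x_star)  # ~1.0, the constrained minimizer
```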

  8. Sources of Nonsmooth Problems
     Minimax problem:
        min_{x ∈ R^n} f(x), where f(x) = max_{1 ≤ i ≤ m} f_i(x).

  9. System of nonlinear equations:
        f_i(x) = 0, i = 1, ..., m.

  10. To solve such a system we often minimize a residual norm:
        min_{x ∈ R^n} ‖f̄(x)‖, where f̄(x) = (f_1(x), ..., f_m(x)).
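A minimal sketch of both constructions, with illustrative component functions f_1, f_2 chosen here; the 1-norm for ‖f̄(x)‖ is an assumption, since the slide leaves the norm unspecified.

```python
import numpy as np

# Two standard ways nonsmooth objectives arise
# (the component functions f_i are illustrative, not from the talk).

def f1(x): return x[0] ** 2 + x[1] ** 2
def f2(x): return (x[0] - 1.0) ** 2 + x[1]
fs = [f1, f2]

def minimax_objective(x):
    # f(x) = max_i f_i(x): nonsmooth where the active maximizer switches.
    return max(fi(x) for fi in fs)

def residual_norm(x):
    # ||(f_1(x), ..., f_m(x))||_1 for the system f_i(x) = 0:
    # the 1-norm makes this nonsmooth wherever some f_i(x) = 0.
    return sum(abs(fi(x)) for fi in fs)

x = np.array([0.5, -0.5])
print(minimax_objective(x), residual_norm(x))
```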

  11. Structure of Optimization Algorithms
      Step 1: choose an initial point x_0 ∈ R^n.
      Step 2: check the termination criteria.
      Step 3: find a descent direction d_k at x_k.
      Step 4: find a step size α_k with f(x_k + α_k d_k) < f(x_k).
      Step 5: set x_{k+1} = x_k + α_k d_k and go to Step 2.
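As a sketch, the five steps map onto a generic descent loop like the following; the names direction_fn and step_fn are placeholders (not from the talk) that the next slides instantiate.

```python
import numpy as np

# Minimal skeleton of the five-step descent loop from the slide.
# direction_fn and step_fn are placeholders for the choices on the
# following slides (steepest descent, Newton, line searches, ...).

def descent_loop(f, grad, x0, direction_fn, step_fn, eps=1e-6, max_iter=1000):
    x = np.asarray(x0, dtype=float)           # Step 1: initial point
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= eps:          # Step 2: termination criterion
            break
        d = direction_fn(x, g)                # Step 3: descent direction
        alpha = step_fn(f, x, d, g)           # Step 4: step size with descent
        x = x + alpha * d                     # Step 5: update and loop
    return x
```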

  12. Classification of Algorithms Based on d_k and α_k
      Directions:
        d_k = −∇f(x_k): steepest descent method
        d_k = −H^{−1}(x_k) ∇f(x_k): Newton method
        d_k = −B^{−1} ∇f(x_k): quasi-Newton method

  13. Step sizes: let h(α) = f(x_k + α d_k).
        1. Solve h′(α) = 0 exactly: exact line search.
        2. Solve it loosely: inexact line search.
      Trust-region methods choose the step without a line search. A sketch of an inexact line search follows below.
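A minimal sketch combining the steepest descent direction with a backtracking (Armijo) inexact line search; the constants c and rho are conventional defaults, and the test problem is illustrative, neither taken from the talk.

```python
import numpy as np

# Steepest descent direction plus a backtracking (Armijo) inexact line
# search; c and rho are conventional defaults, not values from the talk.

def steepest_direction(x, g):
    return -g                                  # d_k = -grad f(x_k)

def backtracking(f, x, d, g, alpha=1.0, rho=0.5, c=1e-4):
    # Shrink alpha until the Armijo sufficient-decrease condition holds:
    #   f(x + alpha d) <= f(x) + c * alpha * <g, d>.
    while f(x + alpha * d) > f(x) + c * alpha * np.dot(g, d):
        alpha *= rho
    return alpha

# Smooth test problem to exercise the pieces (illustrative, not from the talk).
f = lambda x: (x[0] - 1.0) ** 2 + 4.0 * x[1] ** 2
grad = lambda x: np.array([2.0 * (x[0] - 1.0), 8.0 * x[1]])

x = np.array([5.0, 5.0])
while np.linalg.norm(grad(x)) > 1e-6:          # Step 2: termination criterion
    g = grad(x)
    d = steepest_direction(x, g)               # Step 3
    x = x + backtracking(f, x, d, g) * d       # Steps 4-5
print(x)  # ~ (1, 0)
```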

  14. When the objective function is smooth:
      d is a descent direction at x_k if f′(x_k, d) = ⟨∇f(x_k), d⟩ < 0.
      −∇f(x_k) is the steepest descent direction.
      ‖∇f(x_k)‖ ≤ ε is a good stopping criterion (first-order necessary condition).

  15. Subgradient

  16. Basic inequality for a convex differentiable function:
        f(x) ≥ f(y) + ∇f(y)^T (x − y) for all x ∈ dom f.

  17. g ∈ R^n is a subgradient of a convex function f at y if
        f(x) ≥ f(y) + g^T (x − y) for all x ∈ dom f.


  19. Example, f(x) = |x|: if x < 0, g = −1.

  20. If x > 0, g = 1.

  21. If x = 0, g ∈ [−1, 1].
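A quick numeric sanity check of the definition (a minimal sketch, not from the talk): for f(x) = |x| and y = 0, every g ∈ [−1, 1] should satisfy the subgradient inequality.

```python
import numpy as np

# Check f(x) >= f(0) + g * (x - 0) for f(x) = |x|, a grid of x values,
# and several candidate subgradients g in [-1, 1].

f = abs
xs = np.linspace(-2.0, 2.0, 401)
for g in (-1.0, -0.3, 0.0, 0.7, 1.0):        # candidate subgradients at y = 0
    assert all(f(x) >= f(0.0) + g * x for x in xs)
print("every tested g in [-1, 1] satisfies the subgradient inequality at 0")
```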

  22. Subdifferential
      The set of all subgradients of f at x, ∂f(x), is called the subdifferential.

  23. For f(x) = |x|, the subdifferential at x = 0 is ∂f(0) = [−1, 1];
      for x > 0, ∂f(x) = {1}; for x < 0, ∂f(x) = {−1}.

  24. f is differentiable at x ⟺ ∂f(x) = {∇f(x)}.

  25. For locally Lipschitz functions, the Clarke subdifferential is
        ∂f(x_0) = conv { lim ∇f(x_i) : x_i → x_0, ∇f(x_i) exists }.
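The Clarke construction can be illustrated numerically: sample points near x_0 where f is differentiable, collect their gradients, and take the convex hull of the limiting values (in 1-D, an interval). A minimal sketch for f(x) = |x| at x_0 = 0:

```python
import numpy as np

# Approximate the Clarke subdifferential of f(x) = |x| at x0 = 0 by
# sampling gradients at nearby points where f is differentiable.

rng = np.random.default_rng(0)
x0 = 0.0
samples = x0 + rng.uniform(-1e-6, 1e-6, size=1000)   # points near x0
grads = np.sign(samples[samples != 0.0])             # grad |x| = sign(x) for x != 0
hull = (grads.min(), grads.max())                    # convex hull in 1-D
print(hull)  # (-1.0, 1.0): the Clarke subdifferential of |x| at 0 is [-1, 1]
```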

  26. Optimality Condition
      For a smooth convex function:
        f(x*) = inf_{x ∈ dom f} f(x) ⟺ 0 = ∇f(x*).

  27. For a nonsmooth convex function:
        f(x*) = inf_{x ∈ dom f} f(x) ⟺ 0 ∈ ∂f(x*).
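For f(x) = |x| this condition is easy to verify directly; a minimal sketch:

```python
# Nonsmooth optimality condition for f(x) = |x|: the minimizer is x* = 0,
# and 0 lies in the subdifferential ∂f(0) = [-1, 1] even though the
# gradient does not exist there.

subdiff_at_0 = (-1.0, 1.0)                    # ∂|x|(0) = [-1, 1]
is_optimal = subdiff_at_0[0] <= 0.0 <= subdiff_at_0[1]
print(is_optimal)  # True: 0 ∈ ∂f(0), so x* = 0 is a global minimizer
```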

  28. Directional Derivative
      In the smooth case:
        f′(x_k, d) = lim_{λ → 0+} [f(x_k + λd) − f(x_k)] / λ = ⟨∇f(x_k), d⟩.
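A one-sided finite difference approximates this limit; for f(x) = |x| at x = 0 it shows the directional derivative exists in every direction even though the gradient does not (a minimal sketch, not from the talk).

```python
# One-sided finite-difference estimate of the directional derivative
#   f'(x, d) = lim_{lam -> 0+} [f(x + lam*d) - f(x)] / lam.
# For f(x) = |x| at x = 0, f'(0, d) = |d| in every direction d.

def dir_deriv(f, x, d, lam=1e-8):
    return (f(x + lam * d) - f(x)) / lam

print(dir_deriv(abs, 0.0,  1.0))  # ~ 1.0 = f'(0, 1)
print(dir_deriv(abs, 0.0, -1.0))  # ~ 1.0 = f'(0, -1): not linear in d, so nonsmooth at 0
```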
