Optimization (Introduction)


SLIDE 1

Optimization (Introduction)

SLIDE 2

Goal: Find the minimizer y* that minimizes the objective (cost) function g(y): ℝⁿ → ℝ

Optimization

Unconstrained Optimization

SLIDE 3

Goal: Find the minimizer y* that minimizes the objective (cost) function g(y): ℝⁿ → ℝ

Optimization

Constrained Optimization

SLIDE 4

Unconstrained Optimization

  • What if we are looking for a maximizer y*?

    g(y*) = max_y g(y)

Maximizing g is the same as minimizing −g, so any minimization method applies directly.

SLIDE 5

Calculus problem: maximize the rectangle area subject to a perimeter constraint

    max_{d ∈ ℝ²} Area(d)  subject to a fixed Perimeter(d)

SLIDE 6

d₁, d₂ are the rectangle sides:  Area = d₁d₂,  Perimeter = 2(d₁ + d₂)
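The perimeter-constrained rectangle problem can be turned into a 1D unconstrained problem by eliminating one variable with the constraint. A minimal sketch (the perimeter value P = 20 is a made-up choice for illustration):

```python
# Eliminate the constraint: Perimeter = 2*(d1 + d2) = P  =>  d2 = P/2 - d1,
# so the area becomes a function of d1 alone: Area(d1) = d1 * (P/2 - d1).
P = 20.0                                  # hypothetical fixed perimeter
area = lambda d1: d1 * (P / 2 - d1)

# First-order condition: Area'(d1) = P/2 - 2*d1 = 0  =>  d1* = P/4
d1_star = P / 4
d2_star = P / 2 - d1_star
print(d1_star, d2_star, area(d1_star))   # the optimum is a square
```

As the calculus suggests, the optimal rectangle for a fixed perimeter is the square d₁ = d₂ = P/4.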

SLIDE 7
SLIDE 8

Unconstrained Optimization 1D

SLIDE 9

What is the optimal solution? (1D)

    g(y*) = min_y g(y)

(First-order) Necessary condition: g′(y*) = 0
(Second-order) Sufficient condition: g″(y*) > 0

SLIDE 10

Types of optimization problems

    g(y*) = min_y g(y),   where g is nonlinear, continuous and smooth

  • Gradient-free methods: evaluate g(y)
  • Gradient (first-derivative) methods: evaluate g(y), g′(y)
  • Second-derivative methods: evaluate g(y), g′(y), g″(y)

SLIDE 11

Does the solution exist? Is it a local or a global solution?

SLIDE 12

Example (1D)

(Figure: plot of g(y), with ticks at y = 2, 4, 6 on the horizontal axis and values on the order of −100 to 200 on the vertical axis.)

Consider the function g(y) = y⁴/4 − y³/3 − 11y² + 40y. Find the stationary points and check the sufficient condition.
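Assuming the example function is g(y) = y⁴/4 − y³/3 − 11y² + 40y (reconstructed from the garbled slide text), the stationary points and the sufficient condition can be checked numerically:

```python
import numpy as np

# Stationary points solve g'(y) = y**3 - y**2 - 22*y + 40 = 0.
stationary = np.sort(np.roots([1.0, -1.0, -22.0, 40.0]).real)

# Sufficient condition: sign of g''(y) = 3*y**2 - 2*y - 22.
gpp = lambda y: 3 * y**2 - 2 * y - 22
for y in stationary:
    kind = "local min" if gpp(y) > 0 else "local max"
    print(f"y* = {y: .4f}   g''(y*) = {gpp(y): .1f}   -> {kind}")
```

Under this assumed g, the stationary points are y = −5, 2, 4: the second derivative is positive at −5 and 4 (local minima) and negative at 2 (local maximum).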

SLIDE 13

Optimization in 1D: Golden Section Search

  • Similar idea to the bisection method for root finding
  • Needs to bracket the minimum inside an interval
  • Requires the function to be unimodal

A function g: ℝ → ℝ is unimodal on an interval [b, c] if:
  • There is a unique y* ∈ [b, c] such that g(y*) is the minimum on [b, c]
  • For any y₁, y₂ ∈ [b, c] with y₁ < y₂:
      y₂ < y* ⟹ g(y₁) > g(y₂)
      y₁ > y* ⟹ g(y₁) < g(y₂)
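Unimodality is the precondition the bracketing argument relies on. A rough sampling-based sanity check might look like this (a heuristic sketch, not part of the slides — it can only detect non-unimodality on the sampled grid):

```python
def looks_unimodal(g, b, c, n=1000):
    """Heuristic check of unimodality of g on [b, c]:
    sampled values should strictly decrease, then strictly increase."""
    ys = [b + (c - b) * i / n for i in range(n + 1)]
    vals = [g(y) for y in ys]
    k = vals.index(min(vals))                           # sampled minimizer
    dec = all(vals[i] > vals[i + 1] for i in range(k))  # decreasing before it
    inc = all(vals[i] < vals[i + 1] for i in range(k, n))  # increasing after
    return dec and inc
```

For example, (y − 1)² is unimodal on [−3, 4], while (y² − 1)² is not unimodal on [−2, 2] because it has two minima at y = ±1.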

SLIDE 14

(Figure: two sketches of golden section search on a bracket [b, c], showing trial points y₁, y₂, y₃, y₄ and their function values g₁, g₂, g₃, g₄; comparing the interior values determines which subinterval is discarded while still bracketing the minimum.)

SLIDE 15
SLIDE 16
SLIDE 17

Golden Section Search

SLIDE 18

Golden Section Search

What happens to the length of the interval after one iteration?

    h₁ = τ h₀,  or in general:  h_{k+1} = τ h_k

Hence the interval is reduced by the factor τ at every iteration (for the bisection method for solving nonlinear equations, τ = 0.5). For the recursion to reuse one of the previous points:

    τ h₁ = (1 − τ) h₀
    τ · τ h₀ = (1 − τ) h₀
    τ² = 1 − τ  ⟹  τ = (√5 − 1)/2 ≈ 0.618
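The interval-reduction rule above can be sketched as a complete routine (an illustrative implementation; the tolerance-based stopping rule is an added assumption, not from the slides). Note that after the first step, each iteration reuses one interior point and evaluates g only once:

```python
import math

def golden_section_search(g, b, c, tol=1e-8):
    """Minimize a unimodal function g on the bracket [b, c]."""
    tau = (math.sqrt(5) - 1) / 2          # 0.618...
    y1 = c - tau * (c - b)                # lower interior point
    y2 = b + tau * (c - b)                # upper interior point
    g1, g2 = g(y1), g(y2)
    while c - b > tol:
        if g1 < g2:                       # minimum lies in [b, y2]
            c, y2, g2 = y2, y1, g1        # reuse y1 as the new upper point
            y1 = c - tau * (c - b)
            g1 = g(y1)                    # one new evaluation
        else:                             # minimum lies in [y1, c]
            b, y1, g1 = y1, y2, g2        # reuse y2 as the new lower point
            y2 = b + tau * (c - b)
            g2 = g(y2)                    # one new evaluation
    return 0.5 * (b + c)
```

For example, minimizing (y − 3)² on [0, 10] converges to y ≈ 3.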

SLIDE 19
Golden Section Search

  • Derivative-free method!
  • Slow convergence:

      lim_{k→∞} |e_{k+1}| / |e_k| = 0.618,  r = 1 (linear convergence)

  • Only one function evaluation per iteration

SLIDE 20

Example

SLIDE 21

Newton’s Method

Using a Taylor expansion, we can approximate the function g with a quadratic about y_k:

    g(y) ≈ g(y_k) + g′(y_k)(y − y_k) + ½ g″(y_k)(y − y_k)²

We then find the minimum of this quadratic using the first-order necessary condition: setting its derivative to zero, g′(y_k) + g″(y_k)(y − y_k) = 0, gives y = y_k − g′(y_k)/g″(y_k).

SLIDE 22

Newton’s Method

  • Algorithm:

      y₀ = starting guess
      y_{k+1} = y_k − g′(y_k) / g″(y_k)

  • Convergence:
      • Typically quadratic convergence
      • Local convergence (the starting guess must be close to the solution)
      • May fail to converge, or converge to a maximum or a point of inflection
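The update rule above, sketched as code (a minimal illustration; the step-size stopping test and iteration cap are added assumptions, not from the slides):

```python
def newton_minimize(gp, gpp, y0, tol=1e-10, maxit=50):
    """1D Newton's method for optimization: iterate y <- y - g'(y)/g''(y).
    gp and gpp are the first and second derivatives of g."""
    y = y0
    for _ in range(maxit):
        step = gp(y) / gpp(y)   # Newton step on the stationarity equation g'(y) = 0
        y -= step
        if abs(step) < tol:     # stop when the update is negligible
            break
    return y
```

On a quadratic such as g(y) = y² − 4y (so g′(y) = 2y − 4, g″(y) = 2), a single step from any starting guess lands exactly on the minimizer y = 2, consistent with the quadratic-model derivation.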

SLIDE 23

Newton’s Method (Graphical Representation)

SLIDE 24

Example

Consider the function g(y) = 4y³ + 2y² + 5y + 40. If we use the initial guess y₀ = 2, what would be the value of y after one iteration of Newton's method?
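A quick arithmetic check of this example: with g′(y) = 12y² + 4y + 5 and g″(y) = 24y + 4, one Newton step from y₀ = 2 gives y₁ = 2 − 61/52 ≈ 0.827.

```python
# One Newton step for g(y) = 4*y**3 + 2*y**2 + 5*y + 40.
gp  = lambda y: 12 * y**2 + 4 * y + 5    # g'(y)
gpp = lambda y: 24 * y + 4               # g''(y)

y0 = 2.0
y1 = y0 - gp(y0) / gpp(y0)               # Newton update
print(gp(y0), gpp(y0), y1)
```

Worth noting: g′(y) = 12y² + 4y + 5 has negative discriminant (16 − 240 < 0) and is positive everywhere, so g has no stationary point at all; continuing to iterate here illustrates the "may fail to converge" caveat from the previous slide.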