Optimization (Introduction)

Goal: Find the minimizer x* that minimizes the objective (cost) function f(x) : R^n → R
Optimization

Unconstrained Optimization
  min f(x),  f(x) : R^n → R
Optimization

Constrained Optimization
  f(x*) = min f(x)
  s.t.  h_i(x) = 0  → equality constraints
        g_i(x) ≤ 0  → inequality constraints
Unconstrained Optimization

What if we are looking for a maximizer x*?
  max f(x) = -min(-f(x)),  x ∈ R^n

Calculus problem: maximize the rectangle area subject to a perimeter constraint.
Maximize the area A(d1, d2) = d1 · d2 subject to the perimeter constraint
  P(d1, d2) = 2(d1 + d2) = 40

Without the perimeter constraint the area is unbounded; with it, the optimum is the square d1 = d2 = 10, giving A = 100.

[Figure: contours of the area A(d1, d2) with the perimeter constraint line; points beyond the line violate the constraint]
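The constrained rectangle problem can be reduced to an unconstrained 1D problem by substituting the constraint; a minimal sketch (the function name `optimal_rectangle` is illustrative, and P = 40 is the perimeter value used above):

```python
# Maximize rectangle area A = d1*d2 subject to perimeter 2*(d1 + d2) = P.
# Substituting d2 = P/2 - d1 gives the unconstrained 1D problem
# A(d1) = d1*(P/2 - d1), whose stationary point A'(d1) = P/2 - 2*d1 = 0
# yields d1 = P/4, i.e. a square.

def optimal_rectangle(P):
    d1 = P / 4.0        # stationary point of A(d1) = d1*(P/2 - d1)
    d2 = P / 2.0 - d1   # the constraint forces d2 = d1
    return d1, d2, d1 * d2

d1, d2, area = optimal_rectangle(40.0)
print(d1, d2, area)  # → 10.0 10.0 100.0
```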
What is the optimal solution? (1D)

  f(x*) = min f(x)

(First-order) Necessary condition:  f'(x*) = 0 → stationary points (minima, maxima, saddle points)
(Second-order) Sufficient condition:
  f''(x*) > 0 → x* is a minimum
  f''(x*) < 0 → x* is a maximum

Does the solution exist? Local or global solution?
Example (1D)

  f'(x) = x^3 - x^2 - 22x + 40 = (x + 5)(x - 2)(x - 4) = 0 → stationary points x* = -5, 2, 4
  f''(x) = 3x^2 - 2x - 22
  f''(-5) = 3(25) + 10 - 22 = 63 > 0 → x* = -5 is a minimum

[Plot of f(x) with axis ticks at x = -6, -4, -2 and f = -200, -100, showing the stationary points]
Types of optimization problems

  f(x*) = min f(x),  f: nonlinear, continuous and smooth

- Gradient-free methods: evaluate f(x)
- Gradient (first-derivative) methods: evaluate f(x), f'(x)
- Second-derivative methods: evaluate f(x), f'(x), f''(x)
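The stationary-point analysis in the 1D example above can be checked numerically; a small sketch, assuming the reconstructed derivative f'(x) = x^3 - x^2 - 22x + 40 from that example:

```python
# The stationary points of f are the roots of
# f'(x) = x**3 - x**2 - 22*x + 40 = (x + 5)*(x - 2)*(x - 4),
# and the sign of f''(x) = 3*x**2 - 2*x - 22 classifies each one.

def fp(x):   # f'(x)
    return x**3 - x**2 - 22*x + 40

def fpp(x):  # f''(x)
    return 3*x**2 - 2*x - 22

for x_star in (-5.0, 2.0, 4.0):
    assert fp(x_star) == 0.0  # stationary point
    kind = "minimum" if fpp(x_star) > 0 else "maximum"
    print(x_star, fpp(x_star), kind)
# f''(-5) = 63 > 0 (min), f''(2) = -14 < 0 (max), f''(4) = 18 > 0 (min)
```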
Optimization in 1D:
Golden Section Search

- Similar idea to the bisection method for root finding
- Needs to bracket the minimum inside an interval
- Requires the function to be unimodal: for x1 < x2 in the bracket,
  - x2 < x* ⟹ f(x1) > f(x2)
  - x1 > x* ⟹ f(x1) < f(x2)
Start from the bracket [a, b] with length h_k = b - a (h_0 = b - a) and propose the two interior points
  x1 = a + (1 - τ) h_k,  f1 = f(x1)
  x2 = a + τ h_k,        f2 = f(x2)
with τ ≈ 0.618.

- If f1 > f2 → x* ∈ [x1, b]:
    a = x1
    x1 = x2 → f1 = f2
    x2 = a + τ h_{k+1}, evaluate f2 = f(x2)
- If f1 < f2 → x* ∈ [a, x2]:
    b = x2
    x2 = x1 → f2 = f1
    x1 = a + (1 - τ) h_{k+1}, evaluate f1 = f(x1)

Every iteration the interval gets reduced: h_{k+1} = τ h_k.
Golden Section Search

What happens with the length of the interval after one iteration?
  h_1 = τ h_0, or in general h_{k+1} = τ h_k
Hence the interval gets reduced by τ (for the bisection method to solve nonlinear equations, τ = 0.5).

For the recursion to reuse one of the previous points:
  τ h_1 = (1 - τ) h_0
  τ^2 h_0 = (1 - τ) h_0
  τ^2 = 1 - τ  →  τ = (√5 - 1)/2 ≈ 0.618

- Derivative-free method!
- Slow convergence (linear, with constant τ ≈ 0.618)
- Only one new function evaluation per iteration
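The update rules above can be sketched in a few lines; `golden_section_search` is an illustrative name, and the test function (x - 3)^2 is just an example of a unimodal function:

```python
import math

def golden_section_search(f, a, b, tol=1e-8):
    """Minimize a unimodal f on [a, b]; one new f-evaluation per iteration."""
    tau = (math.sqrt(5) - 1) / 2            # ~0.618, from tau**2 = 1 - tau
    x1 = a + (1 - tau) * (b - a)
    x2 = a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 > f2:                         # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2
            x2 = a + tau * (b - a)          # one new evaluation
            f2 = f(x2)
        else:                               # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1
            x1 = a + (1 - tau) * (b - a)    # one new evaluation
            f1 = f(x1)
    return (a + b) / 2

# Minimize f(x) = (x - 3)**2 on [-10, 10]; the exact minimizer is x* = 3.
x_star = golden_section_search(lambda x: (x - 3)**2, -10, 10)
print(x_star)  # ≈ 3.0
```

Note that reusing the retained interior point (and its stored function value) is exactly what the recursion τ^2 = 1 - τ makes possible.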
Golden Section Search II

[Figure: successive brackets; the interior point kept from the previous iteration already has its value of f(x), so only one new evaluation is needed]
Example

  a = -10, b = 10 → h_0 = 20
  h_1 = ?  →  h_1 = τ h_0 = 0.618 × 20 = 12.36
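Since the bracket lengths follow h_k = τ^k h_0, the example's arithmetic also tells us how many iterations a given tolerance requires; a quick sketch (the tolerance 1e-6 is illustrative):

```python
import math

tau = (math.sqrt(5) - 1) / 2   # ~0.618

# Bracket from the example: [a, b] = [-10, 10], so h0 = 20.
h0 = 20.0
h1 = tau * h0
print(round(h1, 2))  # → 12.36

# From h_k = tau**k * h0, reaching a bracket of width tol needs
# k >= log(tol / h0) / log(tau) iterations.
tol = 1e-6
k = math.ceil(math.log(tol / h0) / math.log(tau))
print(k)  # iterations to shrink a width-20 bracket below 1e-6
```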
Newton's Method

Using a Taylor expansion, we can approximate the function f with a quadratic about x_k:
  f(x) ≈ f(x_k) + f'(x_k)(x - x_k) + (1/2) f''(x_k)(x - x_k)^2
and we find the minimum of the quadratic using the first-order necessary condition (stationary point):
  f'(x_k) + f''(x_k)(x - x_k) = 0  →  h = x - x_k = -f'(x_k) / f''(x_k)
  x_{k+1} = x_k + h
Newton's Method

- Algorithm: repeat x_{k+1} = x_k - f'(x_k) / f''(x_k) until |x_{k+1} - x_k| is small enough
- Convergence:
  - Typical quadratic convergence
  - Local convergence (start guess close to solution)
  - May fail to converge, or converge to a maximum or a saddle point
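A minimal sketch of the algorithm above (the helper name `newton_minimize` and the stopping rule on |h| are illustrative choices):

```python
def newton_minimize(fp, fpp, x0, tol=1e-10, max_iter=50):
    """1D Newton's method for optimization: x_{k+1} = x_k - f'(x_k)/f''(x_k).

    Quadratically convergent near a root of f'(x) = 0, but only locally
    convergent, and it may land on a maximum or saddle point."""
    x = x0
    for _ in range(max_iter):
        h = -fp(x) / fpp(x)   # Newton step from the quadratic model
        x += h
        if abs(h) < tol:      # step small enough: stop
            break
    return x

# f(x) = (x - 3)**2 + 1: f'(x) = 2*(x - 3), f''(x) = 2, minimizer x* = 3.
# For a quadratic f, the model is exact, so Newton converges in one step.
print(newton_minimize(lambda x: 2*(x - 3), lambda x: 2.0, x0=0.0))  # → 3.0
```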
Newton's Method (Graphical Representation)

[Figure: iterates x_0, x_1, x_2, x_3 approaching x*; the sequence of optima of the successive quadratic approximations of f]
Example

Consider the function f(x) = 4x^3 + 2x^2 + 5x + 40. If we use the initial guess x_0 = 2, what would be the value of x after one iteration of Newton's method?

  f'(x) = 12x^2 + 4x + 5  →  f'(x_0) = 61
  f''(x) = 24x + 4        →  f''(x_0) = 52
  h = -f'(x_0) / f''(x_0) = -61/52 ≈ -1.1731
  x_1 = x_0 + h = 2 - 1.1731 ≈ 0.8269
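The single iteration can be verified directly (a sketch; `fp` and `fpp` are just the derivatives computed above):

```python
# One Newton step for the example f(x) = 4*x**3 + 2*x**2 + 5*x + 40
# starting from x0 = 2.

def fp(x):   # f'(x) = 12*x**2 + 4*x + 5
    return 12*x**2 + 4*x + 5

def fpp(x):  # f''(x) = 24*x + 4
    return 24*x + 4

x0 = 2.0
h = -fp(x0) / fpp(x0)   # -61/52
x1 = x0 + h
print(x1)  # ≈ 0.8269
```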