Convex Optimization — Boyd & Vandenberghe
- 10. Unconstrained minimization
- terminology and assumptions
- gradient descent method
- steepest descent method
- Newton’s method
- self-concordant functions
- implementation
10–1
Unconstrained minimization 10–2
examples:
  f(x) = log ∑_{i=1}^{m} exp(aᵢᵀx + bᵢ)
  f(x) = −∑_{i=1}^{m} log(bᵢ − aᵢᵀx), dom f = {x | aᵢᵀx < bᵢ, i = 1, …, m}
Unconstrained minimization 10–3
strong convexity (∇²f(x) ⪰ mI on S) implies
  f(y) ≥ f(x) + ∇f(x)ᵀ(y − x) + (m/2)‖y − x‖₂²
and the stopping-criterion bound f(x) − p∗ ≤ (1/(2m))‖∇f(x)‖₂²
Unconstrained minimization 10–4
Unconstrained minimization 10–5
Unconstrained minimization 10–6
Unconstrained minimization 10–7
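The gradient descent method with backtracking line search, listed in the outline above, can be sketched in Python as follows; the least-squares test problem and the parameter values α = 0.1, β = 0.7 are illustrative assumptions, not taken from the slides:

```python
import numpy as np

def backtracking(f, grad, x, dx, alpha=0.1, beta=0.7):
    """Backtracking line search: shrink t until
    f(x + t*dx) <= f(x) + alpha * t * grad(x)^T dx."""
    t = 1.0
    g = grad(x)
    while f(x + t * dx) > f(x) + alpha * t * (g @ dx):
        t *= beta
    return t

def gradient_descent(f, grad, x0, tol=1e-8, max_iter=10_000):
    """Gradient method: dx = -grad f(x); stop when ||grad f(x)|| <= tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) <= tol:
            break
        dx = -g
        x = x + backtracking(f, grad, x, dx) * dx
    return x

# hypothetical test problem: least squares f(x) = 0.5 * ||Ax - b||^2
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
f = lambda x: 0.5 * np.sum((A @ x - b) ** 2)
grad = lambda x: A.T @ (A @ x - b)
x = gradient_descent(f, grad, np.zeros(5))
```

For this smooth, strongly convex test problem the iterates converge to the least-squares solution.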
quadratic example in R²: f(x) = (1/2)(x₁² + γx₂²), γ > 0
Unconstrained minimization 10–8
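For the quadratic example f(x) = (1/2)(x₁² + γx₂²), gradient descent with exact line search started at x⁽⁰⁾ = (γ, 1) is known to produce the iterates x₁⁽ᵏ⁾ = γ((γ−1)/(γ+1))ᵏ, x₂⁽ᵏ⁾ = (−(γ−1)/(γ+1))ᵏ. A short numerical check of this closed form (γ = 10 is an arbitrary choice; for a quadratic, the exact step is t = gᵀg / gᵀHg):

```python
import numpy as np

gamma = 10.0                         # arbitrary choice; any gamma > 0
H = np.diag([1.0, gamma])            # Hessian of f(x) = 0.5*(x1^2 + gamma*x2^2)
grad = lambda x: H @ x

x = np.array([gamma, 1.0])           # starting point x(0) = (gamma, 1)
iterates = [x.copy()]
for _ in range(5):
    g = grad(x)
    t = (g @ g) / (g @ H @ g)        # exact line search step for a quadratic
    x = x - t * g
    iterates.append(x.copy())

# known closed form: x1(k) = gamma*r**k, x2(k) = (-r)**k with r = (gamma-1)/(gamma+1)
r = (gamma - 1) / (gamma + 1)
```

The convergence factor |(γ−1)/(γ+1)| explains the slow zig-zagging when γ is very large or very small.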
Unconstrained minimization 10–9
f(x) = cᵀx − ∑_{i=1}^{500} log(bᵢ − aᵢᵀx)
[figure: f(x⁽ᵏ⁾) − p∗ versus iteration k, on a log scale]
Unconstrained minimization 10–10
Unconstrained minimization 10–11
steepest descent step in the quadratic norm ‖z‖_P = (zᵀPz)^{1/2} (P ∈ Sⁿ₊₊): ∆x_sd = −P⁻¹∇f(x)
Unconstrained minimization 10–12
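The steepest descent step in a quadratic norm, ∆x_sd = −P⁻¹∇f(x), can be interpreted as a plain gradient step after the change of coordinates x̄ = P^{1/2}x, since the gradient in the new coordinates is P^{−1/2}∇f(x). A small sketch verifying this equivalence numerically (the matrix P and the gradient vector here are arbitrary test data):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
M = rng.standard_normal((n, n))
P = M @ M.T + n * np.eye(n)          # an assumed positive definite P

# symmetric square root of P (and its inverse) via eigendecomposition
w, V = np.linalg.eigh(P)
P_half_inv = V @ np.diag(1 / np.sqrt(w)) @ V.T

g = rng.standard_normal(n)           # stand-in for grad f(x) at the current point

# steepest descent step in the quadratic norm ||z||_P:
dx_sd = -np.linalg.solve(P, g)

# gradient step in coordinates xbar = P^(1/2) x, mapped back to x:
# grad of the transformed function is P^(-1/2) g
dx_coord = P_half_inv @ (-(P_half_inv @ g))
```

Both expressions reduce to −P⁻¹g, since P^{−1/2}P^{−1/2} = P⁻¹.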
Unconstrained minimization 10–13
Unconstrained minimization 10–14
Unconstrained minimization 10–15
Newton decrement: λ(x) = (∇f(x)ᵀ∇²f(x)⁻¹∇f(x))^{1/2}, so λ(x)² = ∆x_ntᵀ ∇²f(x) ∆x_nt
Unconstrained minimization 10–16
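Newton's method with backtracking line search, using the Newton decrement stopping criterion λ(x)²/2 ≤ ε, can be sketched as follows; the test function −∑ log(1 − xᵢ²) (minimized at x = 0) and the parameter values are illustrative assumptions:

```python
import numpy as np

def newton(f, grad, hess, x0, eps=1e-10, alpha=0.1, beta=0.7, max_iter=100):
    """Damped Newton's method with backtracking line search.
    Stops when lambda(x)^2 / 2 <= eps, where
    lambda(x)^2 = dx_nt^T (Hessian) dx_nt is the squared Newton decrement."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g, H = grad(x), hess(x)
        dx = -np.linalg.solve(H, g)  # Newton step dx_nt
        lam2 = -g @ dx               # = dx_nt^T H dx_nt = g^T H^{-1} g
        if lam2 / 2 <= eps:
            break
        t = 1.0
        while f(x + t * dx) > f(x) + alpha * t * (g @ dx):
            t *= beta
        x = x + t * dx
    return x

# hypothetical test problem: f(x) = -sum log(1 - x_i^2), minimized at x = 0
# (f returns +inf outside the domain so the line search rejects infeasible steps)
f = lambda x: np.inf if np.any(np.abs(x) >= 1) else -np.sum(np.log(1 - x**2))
grad = lambda x: 2 * x / (1 - x**2)
hess = lambda x: np.diag(2 * (1 + x**2) / (1 - x**2) ** 2)

x_min = newton(f, grad, hess, 0.5 * np.ones(3))
```

Returning +inf outside dom f is a simple way to make backtracking respect an open domain.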
Unconstrained minimization 10–17
Unconstrained minimization 10–18
Unconstrained minimization 10–19
Unconstrained minimization 10–20
[figure: f(x⁽ᵏ⁾) − p∗ versus iteration k, on a log scale]
Unconstrained minimization 10–21
[figures: f(x⁽ᵏ⁾) − p∗ versus iteration k, and step size t⁽ᵏ⁾ versus k]
Unconstrained minimization 10–22
example in R¹⁰⁰⁰⁰: f(x) = −∑_{i=1}^{10000} log(1 − xᵢ²) − ∑_{i=1}^{100000} log(bᵢ − aᵢᵀx)
[figure: f(x⁽ᵏ⁾) − p∗ versus iteration k, on a log scale]
Unconstrained minimization 10–23
Unconstrained minimization 10–24
Unconstrained minimization 10–25
f(x) = −∑_{i=1}^{m} log(bᵢ − aᵢᵀx) on {x | aᵢᵀx < bᵢ, i = 1, …, m}
f(X) = −log det X on Sⁿ₊₊
Unconstrained minimization 10–26
Unconstrained minimization 10–27
f(x) = −∑_{i=1}^{m} log(bᵢ − aᵢᵀx)
[figure: number of Newton iterations required, over randomly generated instances]
Unconstrained minimization 10–28
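A minimal sketch of Newton's method applied to one instance of minimize −∑ log(bᵢ − aᵢᵀx), using the closed-form derivatives ∇f(x) = Aᵀd and ∇²f(x) = Aᵀ diag(d)² A with dᵢ = 1/(bᵢ − aᵢᵀx); the problem sizes, random data, and line-search constants are assumptions:

```python
import numpy as np

# random instance of minimize -sum log(b_i - a_i^T x); sizes are assumptions
rng = np.random.default_rng(2)
m, n = 200, 20
A = rng.standard_normal((m, n))
x0 = np.zeros(n)
b = A @ x0 + rng.uniform(1.0, 2.0, m)    # makes x0 strictly feasible

def f(x):
    s = b - A @ x
    return np.inf if np.any(s <= 0) else -np.sum(np.log(s))

x = x0.copy()
for _ in range(100):
    s = b - A @ x
    d = 1.0 / s
    g = A.T @ d                          # gradient: sum a_i / (b_i - a_i^T x)
    H = (A * d[:, None] ** 2).T @ A      # Hessian: sum a_i a_i^T / (b_i - a_i^T x)^2
    dx = np.linalg.solve(H, -g)          # Newton step
    lam2 = -g @ dx                       # squared Newton decrement
    if lam2 / 2 <= 1e-12:
        break
    t = 1.0                              # backtracking with alpha = 0.01, beta = 0.5
    while f(x + t * dx) > f(x) - 0.01 * t * lam2:
        t *= 0.5
    x = x + t * dx
```

With m much larger than n, the polyhedron {x | aᵢᵀx < bᵢ} is bounded with overwhelming probability, so the minimizer (the analytic center) exists and Newton's method converges in a few iterations.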
Unconstrained minimization 10–29
minimize ∑_{i=1}^{n} ψᵢ(xᵢ) + ψ₀(Ax + b); Hessian H = D + AᵀH₀A with
  D diagonal, Dᵢᵢ = ψᵢ″(xᵢ); H₀ = ∇²ψ₀(Ax + b)
method: factor H₀ = L₀L₀ᵀ; write Newton system as
  D∆x + AᵀL₀w = −g, L₀ᵀA∆x − w = 0
eliminate ∆x from the first equation and solve
  (I + L₀ᵀAD⁻¹AᵀL₀)w = −L₀ᵀAD⁻¹g,
then recover ∆x = −D⁻¹(g + AᵀL₀w); the dominant cost is forming L₀ᵀAD⁻¹AᵀL₀
Unconstrained minimization 10–30
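The block elimination for the structured Newton system H = D + AᵀH₀A described above can be sketched numerically: solving the small p × p system in w and recovering ∆x = −D⁻¹(g + AᵀL₀w) reproduces the direct solve of (D + AᵀH₀A)∆x = −g. The sizes and random data below are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 50, 10                          # structure pays off when p << n
A = rng.standard_normal((p, n))
d = rng.uniform(1.0, 2.0, n)           # D = diag(d), D_ii = psi_i''(x_i) > 0
M = rng.standard_normal((p, p))
H0 = M @ M.T + np.eye(p)               # stand-in for grad^2 psi_0(Ax+b), pos. def.
g = rng.standard_normal(n)             # stand-in for the gradient of f

# direct solve of the Newton system (D + A^T H0 A) dx = -g: O(n^3)
H = np.diag(d) + A.T @ H0 @ A
dx_direct = np.linalg.solve(H, -g)

# block elimination: factor H0 = L0 L0^T, then solve a small p x p system
L0 = np.linalg.cholesky(H0)
B = (A / d[None, :]) @ A.T             # A D^{-1} A^T  (p x p)
rhs = -L0.T @ (A @ (g / d))            # -L0^T A D^{-1} g
w = np.linalg.solve(np.eye(p) + L0.T @ B @ L0, rhs)
dx = -(g + A.T @ (L0 @ w)) / d         # dx = -D^{-1} (g + A^T L0 w)
```

Here the expensive objects are p × p instead of n × n, so the cost is dominated by forming AD⁻¹Aᵀ rather than factoring the full Hessian.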