MORE CONVEX OPTIMIZATION
Matthieu R Bloch - PowerPoint PPT Presentation


SLIDE 1

Matthieu R Bloch Thursday, February 20, 2020

MORE CONVEX OPTIMIZATION

SLIDE 2

LOGISTICS

TAs and office hours:
Tuesday: TJ (VL C449 Cubicle D), 1:30pm-2:45pm
Wednesday: Matthieu (TSRB 423), 12:00pm-1:15pm
Thursday: Hossein (VL C449 Cubicle B), 10:45am-12:00pm
Friday: Brighton (TSRB 523a), 12:00pm-1:15pm

Homework 3:
Due Wednesday February 19, 2020 11:59pm EST (Friday February 21, 2020 for DL)
Please submit a standalone PDF with your solution, including plots and listings
Please also submit your code (we check it randomly)
Make sure you show your work; don't leave gaps in logic

SLIDE 3

PROJECTS

Form teams of 3-5 and report on the shared spreadsheet (link on Canvas).
Topic is up to you! You can explore papers or play with scikit-learn.
No need to invent new things, but I want to see some depth.
Consult with me if you have ideas; it's ok to use something related to your research.

Timeline:
Teams formed by February 23, 2020
Project proposal: March 8, 2020
Poster presentation: April 14 and 16, 2020
Final report: April 24, 2020

SLIDE 4

RECAP: CONSTRAINED OPTIMIZATION

The general form of a constrained optimization problem is
$$\min_{x\in\mathbb{R}^d} f(x)\quad\text{such that}\quad\begin{cases} g_i(x)\leq 0 & \forall i\in[1;m]\\ h_j(x)=0 & \forall j\in[1;p]\end{cases}$$
$f$ is called the objective function, each $g_i$ is called an inequality constraint, and each $h_j$ is called an equality constraint. If $x$ satisfies all the constraints, we say that it is feasible.

Definition (Convex problem). A constrained optimization problem is convex if $f$ is convex, the $g_i$'s are convex, and the $h_j$'s are affine.

We turn a constrained optimization problem into an unconstrained one using the Lagrangian
$$L(x,\lambda,\mu)\triangleq f(x)+\sum_{i=1}^{m}\lambda_i g_i(x)+\sum_{j=1}^{p}\mu_j h_j(x)\quad\text{with }\lambda\geq 0,$$
where $\lambda\triangleq[\lambda_1,\cdots,\lambda_m]^\intercal$ and $\mu\triangleq[\mu_1,\cdots,\mu_p]^\intercal$ are the dual variables or Lagrange multipliers.
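A quick numerical sanity check of the key property behind the Lagrangian (a sketch on a toy problem not from the slides: $f(x)=x^2$ with the single inequality constraint $g(x)=1-x\leq 0$): for any feasible $x$ and any $\lambda\geq 0$, the Lagrangian lower-bounds the objective, since $\lambda g(x)\leq 0$.

```python
import numpy as np

# Illustration (toy problem, not from the slides): for feasible x and
# lam >= 0, L(x, lam) = f(x) + lam * g(x) <= f(x) because g(x) <= 0.
# Problem: f(x) = x^2, g(x) = 1 - x <= 0 (feasible set: x >= 1).

rng = np.random.default_rng(1)

f = lambda x: x**2
g = lambda x: 1.0 - x

ok = True
for _ in range(1000):
    x = rng.uniform(1.0, 10.0)    # feasible point, so g(x) <= 0
    lam = rng.uniform(0.0, 10.0)  # dual variable, lam >= 0
    L = f(x) + lam * g(x)
    ok = ok and (L <= f(x) + 1e-12)

print(ok)  # True: L(x, lam) <= f(x) on the feasible set
```

This inequality is what makes minimizing the Lagrangian over $x$ a lower bound on the constrained optimum, which is the starting point for duality on the next slides.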

SLIDE 5

RECAP: PRIMAL AND DUAL PROBLEMS

The Lagrange dual function is
$$L_D(\lambda,\mu) = \min_x L(x,\lambda,\mu)$$
The dual optimization problem is
$$\max_{\lambda\geq 0,\mu} L_D(\lambda,\mu) = \max_{\lambda\geq 0,\mu}\min_x L(x,\lambda,\mu)$$
The primal function is
$$L_P(x) \triangleq \max_{\lambda\geq 0,\mu} L(x,\lambda,\mu)$$
The primal optimization problem is
$$\min_x L_P(x) = \min_x \max_{\lambda\geq 0,\mu} L(x,\lambda,\mu)$$

Lemma. The (Lagrangian) dual function is concave.

Lemma. Let $F$ denote the set of feasible values of the original problem. Then $\min_x L_P(x) = \min_{x\in F} f(x)$.
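The concavity lemma can be checked numerically: for each fixed $x$, $L(x,\lambda,\mu)$ is affine in $(\lambda,\mu)$, and a pointwise minimum of affine functions is concave. A minimal sketch on a toy problem (my illustration, not from the slides): $f(x)=x^2$, $g(x)=1-x\leq 0$, checking midpoint concavity of $L_D$ on random pairs.

```python
import numpy as np

# Check midpoint concavity of the dual function L_D(lam) = min_x L(x, lam)
# for the toy problem f(x) = x^2, g(x) = 1 - x <= 0 (illustration only).

rng = np.random.default_rng(0)
xs = np.linspace(-5, 5, 4001)  # grid over which we approximate min_x

def dual(lam):
    # L(x, lam) = x^2 + lam * (1 - x); a min over affine-in-lam functions
    return (xs**2 + lam * (1.0 - xs)).min()

ok = True
for _ in range(100):
    a, b = rng.uniform(0, 5, size=2)
    # concavity: the value at the midpoint dominates the chord
    ok = ok and dual((a + b) / 2) >= (dual(a) + dual(b)) / 2 - 1e-9

print(ok)  # True on all sampled pairs
```

Since the grid-based `dual` is itself a minimum of finitely many affine functions of `lam`, the check holds exactly up to floating-point error.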

SLIDE 7

WEAK DUALITY

Theorem (Weak duality).
$$d^* \triangleq \max_{\lambda\geq 0,\mu}\min_x L(x,\lambda,\mu) \leq \min_x\max_{\lambda\geq 0,\mu} L(x,\lambda,\mu) \triangleq p^*$$
We can solve the dual (a concave maximization problem) and obtain a lower bound for the primal.
The gap $p^* - d^* \geq 0$ is called the duality gap.
Strong duality is the situation in which $p^* - d^* = 0$.
Question: how are the solutions of the primal and dual problems related?
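A worked example of $d^*\leq p^*$ with strong duality (my own illustration, not from the slides): minimize $f(x)=x_1^2+x_2^2$ subject to the equality constraint $h(x)=x_1+x_2-1=0$, whose dual function has the closed form $L_D(\mu)=-\mu^2/2-\mu$ (note $\mu$ is unrestricted in sign).

```python
import numpy as np

# Toy equality-constrained problem (illustration, not from the slides):
#   minimize f(x) = x1^2 + x2^2  subject to  h(x) = x1 + x2 - 1 = 0.
# By symmetry the primal optimum is x* = (1/2, 1/2), so p* = 1/2.

def dual(mu):
    # L(x, mu) = x1^2 + x2^2 + mu * (x1 + x2 - 1)
    # Minimizing over x in closed form gives x1 = x2 = -mu/2, hence
    # L_D(mu) = -mu^2/2 - mu
    return -mu**2 / 2 - mu

mus = np.linspace(-4, 4, 8001)  # mu is unrestricted in sign
d_star = dual(mus).max()        # attained at mu = -1, value 1/2
p_star = 0.5

print(p_star - d_star)  # duality gap is numerically zero: strong duality
```

Here $d^* = L_D(-1) = 1/2 = p^*$, so the duality gap vanishes, as expected for a convex problem with affine equality constraints.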

SLIDE 10

KARUSH-KUHN-TUCKER CONDITIONS

Assume $f$, the $g_i$'s, and the $h_j$'s are all differentiable, and consider $(x,\lambda,\mu)$.

Stationarity: $0 = \nabla f(x) + \sum_{i=1}^{m}\lambda_i\nabla g_i(x) + \sum_{j=1}^{p}\mu_j\nabla h_j(x)$
Primal feasibility: $g_i(x)\leq 0\ \forall i\in[1;m]$ and $h_j(x)=0\ \forall j\in[1;p]$
Dual feasibility: $\lambda_i\geq 0\ \forall i\in[1;m]$
Complementary slackness: $\lambda_i g_i(x)=0\ \forall i\in[1;m]$
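The four conditions are easy to check at a candidate point. A minimal sketch on a toy problem (an illustration, not from the slides): $f(x)=x^2$, $g(x)=1-x\leq 0$, with candidate primal/dual pair $x=1$, $\lambda=2$.

```python
# Check the KKT conditions at a candidate point for the toy problem
# (illustration only): minimize f(x) = x^2 subject to g(x) = 1 - x <= 0.
# Candidate primal/dual pair: x = 1, lam = 2.

x, lam = 1.0, 2.0

grad_f = 2 * x     # f'(x) = 2x
grad_g = -1.0      # g'(x) = -1
g_val = 1.0 - x    # g(x)

stationarity = abs(grad_f + lam * grad_g) < 1e-12  # 2 + 2*(-1) = 0
primal_feasible = g_val <= 0                       # g(1) = 0
dual_feasible = lam >= 0                           # lam = 2 >= 0
comp_slackness = abs(lam * g_val) < 1e-12          # 2 * 0 = 0

print(all([stationarity, primal_feasible, dual_feasible, comp_slackness]))
# True: (x, lam) = (1, 2) satisfies all four KKT conditions
```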

SLIDE 11

KKT CONDITIONS: NECESSITY AND SUFFICIENCY

Theorem (KKT necessity). If $x^*$ and $(\lambda^*,\mu^*)$ are primal and dual solutions with zero duality gap, then $x^*$ and $(\lambda^*,\mu^*)$ satisfy the KKT conditions.

Theorem (KKT sufficiency). If the original problem is convex and $\tilde{x}$ and $(\tilde{\lambda},\tilde{\mu})$ satisfy the KKT conditions, then $\tilde{x}$ is primal optimal, $(\tilde{\lambda},\tilde{\mu})$ is dual optimal, and the duality gap is zero.

If a constrained optimization problem is differentiable and convex, the KKT conditions are necessary and sufficient for primal/dual optimality (with zero duality gap), so we can use the KKT conditions to find a solution to our optimization problem.

We're in luck: the optimal soft-margin hyperplane falls in this category!
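Using the KKT conditions to actually solve a convex problem typically means enumerating the complementary-slackness cases. A hypothetical example (not from the slides): minimize $f(x)=(x-2)^2$ subject to $g(x)=x-1\leq 0$; stationarity gives $2(x-2)+\lambda=0$ and slackness gives $\lambda(x-1)=0$.

```python
# Solve a convex problem via the KKT conditions by enumerating the
# complementary-slackness cases (hypothetical example, not from the
# slides): minimize f(x) = (x - 2)^2 subject to g(x) = x - 1 <= 0.
# Stationarity: 2*(x - 2) + lam = 0; slackness: lam * (x - 1) = 0.

candidates = []

# Case 1: inactive constraint, lam = 0 -> stationarity gives x = 2
x, lam = 2.0, 0.0
if x - 1 <= 0 and lam >= 0:      # fails: x = 2 is infeasible
    candidates.append((x, lam))

# Case 2: active constraint, x = 1 -> stationarity gives lam = -2*(x - 2)
x = 1.0
lam = -2 * (x - 2)               # lam = 2 >= 0, dual feasible
if x - 1 <= 0 and lam >= 0:
    candidates.append((x, lam))

# Exactly one case survives; by KKT sufficiency it is the optimum.
print(candidates)  # [(1.0, 2.0)]
```

The same case-enumeration idea, applied to the soft-margin objective, is what leads to the support-vector characterization of the optimal hyperplane.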
