Dualité sans contraintes en optimisation globale (Duality without constraints in global optimization) - Simon Boulmier - PowerPoint presentation


slide-1
SLIDE 1

Duality without constraints in global optimization (Dualité sans contraintes en optimisation globale)

Simon Boulmier

www.localsolver.com

1 | 28

slide-2
SLIDE 2

Global optimization and lower bounds

Mixed-integer nonlinear problem:

min_{x ∈ Rn}  f(x)
s.t.  l ≤ x ≤ u,
      g(x) = 0,
      h(x) ≤ 0,
      xi ∈ Z, i ∈ I.

[Figure: the primal search drives an upper bound v̄ down and the dual search drives a lower bound v̲ up towards the optimal value v∗ of f.]

2 | 28

slide-3
SLIDE 3

Table of contents

Global optimization in a nutshell

I - Reformulation
II - Bound tightening techniques
III - Convex relaxations
IV - Branch-and-bound

Reliability of lower bounds

I - Solving convex relaxations
II - Unconstrained Wolfe duality

3 | 28

slide-4
SLIDE 4

Global optimization in a nutshell

I - Reformulation

4 | 28

slide-5
SLIDE 5

The optimal bucket

[Figure: a bucket with radii r, R and height h.]

max_{x ∈ R3}  (π/3) x3 (x1² + x1x2 + x2²)

s.t.  0 ≤ x1 ≤ 1,
      0 ≤ x2 ≤ 1,
      0 ≤ x3 ≤ 1,
      x1² + (x1 + x2) √((x1 − x2)² + x3²) ≤ 1

5 | 28
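The bucket model above can be explored numerically. A minimal sketch, assuming SciPy's SLSQP as a local solver; it returns a local optimum only, not the globally certified one a spatial branch-and-bound would provide:

```python
import numpy as np
from scipy.optimize import minimize

def volume(x):
    """Bucket volume, the objective to maximise."""
    x1, x2, x3 = x
    return np.pi / 3 * x3 * (x1**2 + x1 * x2 + x2**2)

def surface_slack(x):
    """Nonnegative exactly when the surface constraint of the slide holds."""
    x1, x2, x3 = x
    return 1.0 - (x1**2 + (x1 + x2) * np.sqrt((x1 - x2) ** 2 + x3**2))

# maximise the volume by minimising its negation, from a feasible start
res = minimize(lambda x: -volume(x), x0=[0.4, 0.6, 0.5],
               bounds=[(0.0, 1.0)] * 3,
               constraints=[{"type": "ineq", "fun": surface_slack}],
               method="SLSQP")
print(res.x, volume(res.x))
```

The starting point (0.4, 0.6, 0.5) satisfies the surface constraint strictly, so SLSQP can only improve on its volume of about 0.40.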

slide-6
SLIDE 6

Factorisation of the model

[Figure: the model factorised into a computational graph with intermediate variables w.]

6 | 28

slide-7
SLIDE 7

The main idea [7]

7 | 28

slide-8
SLIDE 8

Factorisation of the model

8 | 28

slide-9
SLIDE 9

Global optimization in a nutshell

II - Bound tightening techniques

9 | 28

slide-10
SLIDE 10

The box constraint

[Figure: a box in the variables x, y, z.]

10 | 28

slide-11
SLIDE 11

Feasibility based bound tightening (FBBT)

[Figure: four steps of interval propagation between x ∈ [lx, ux] and y ∈ [ly, uy].]

11 | 28
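A single FBBT pass can be sketched on the linear constraint z = x + y; the constraint and the interval values below are illustrative assumptions, not taken from the slide:

```python
def fbbt_sum(bx, by, bz):
    """One FBBT pass on z = x + y; each argument is an (lo, up) interval."""
    (lx, ux), (ly, uy), (lz, uz) = bx, by, bz
    # forward propagation: z = x + y
    lz, uz = max(lz, lx + ly), min(uz, ux + uy)
    # backward propagation: x = z - y and y = z - x
    lx, ux = max(lx, lz - uy), min(ux, uz - ly)
    ly, uy = max(ly, lz - ux), min(uy, uz - lx)
    return (lx, ux), (ly, uy), (lz, uz)

# x, y in [0, 10] and z = x + y in [0, 5]: both x and y tighten to [0, 5]
print(fbbt_sum((0, 10), (0, 10), (0, 5)))
```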

slide-12
SLIDE 12

Global optimization in a nutshell

III - Convex relaxations

12 | 28

slide-13
SLIDE 13

Convex relaxations [8, 2]

13 | 28

slide-14
SLIDE 14

Convex relaxations [4, 3, 1]

14 | 28

slide-15
SLIDE 15

Global optimization in a nutshell

IV - Branch-and-bound

15 | 28

slide-16
SLIDE 16

The branch-and-reduce algorithm [6]

[Figure: the box in the variables x, y, z being branched and reduced.]

16 | 28

slide-18
SLIDE 18

Summary

Reformulation
Convex relaxations
Bound tightening
Branch-and-bound

17 | 28

slide-19
SLIDE 19

Solving the convex relaxations

Reliability of lower bounds

18 | 28

slide-22
SLIDE 22

Some issues with the solvers

◮ Smooth convex problem: min { f(x) | x ∈ Rn, g(x) ≤ 0 ∈ Rm }
◮ KKT certificate depends on L(x, µ) = f(x) + µTg(x):

Condition       | Exact           | Approximate
Stationarity    | ∇xL(x, µ) = 0   | ∥∇xL(x, µ)∥ ≤ ϵstatio
Feasibility     | gi(x) ≤ 0       | gi(x) ≤ ϵfeas
Complementarity | µigi(x) = 0     | |µigi(x)| ≤ ϵcompl

19 | 28
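The approximate certificate above can be checked mechanically. A minimal sketch; the tolerance handling and the toy convex problem are assumptions for illustration:

```python
import numpy as np

def approx_kkt(x, mu, grad_f, g_vals, grad_g, eps=1e-6):
    """Check the approximate KKT certificate of the table above."""
    grad_L = grad_f + grad_g.T @ mu                 # ∇xL(x, µ)
    statio = np.linalg.norm(grad_L) <= eps          # stationarity
    feas = np.all(g_vals <= eps)                    # feasibility
    compl = np.all(np.abs(mu * g_vals) <= eps)      # complementarity
    return bool(statio and feas and compl and np.all(mu >= 0.0))

# toy convex problem: min x1^2 + x2^2  s.t.  1 - x1 <= 0
x = np.array([1.0, 0.0])            # primal point
mu = np.array([2.0])                # multiplier
grad_f = 2.0 * x                    # gradient of the objective
g_vals = np.array([1.0 - x[0]])     # constraint value
grad_g = np.array([[-1.0, 0.0]])    # constraint Jacobian
print(approx_kkt(x, mu, grad_f, g_vals, grad_g))  # exact KKT point -> True
```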

slide-24
SLIDE 24

The solution trick (Neumaier [5])

20 | 28

slide-30
SLIDE 30

A duality result

Theorem

The Wolfe dual of the convex differentiable problem

(P)  min_{x ∈ Rn} f(x)   s.t.  ℓ ≤ x ≤ u

is the unconstrained maximisation problem

(D)  max_x θ(x)

with the dual function:

θ(x) = f(x) + ∑i (ℓi − xi) ∇if(x)+ + (ui − xi) ∇if(x)−
     = f(x) + (ℓ − x)T∇f(x)+ + (u − x)T∇f(x)−

where ∇f(x)+ = max(∇f(x), 0) and ∇f(x)− = min(∇f(x), 0) componentwise.

21 | 28
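The dual function θ gives a valid lower bound at any point x, feasible or not (weak duality). A minimal sketch of this property on an illustrative convex quadratic, which is an assumption and not an example from the slides:

```python
import numpy as np

def theta(x, f, grad_f, lo, up):
    """Wolfe dual function of  min f(x) s.t. lo <= x <= up  (f convex)."""
    g = grad_f(x)
    return f(x) + (lo - x) @ np.maximum(g, 0.0) + (up - x) @ np.minimum(g, 0.0)

# illustrative convex instance: f(x) = ||x - c||^2 over the box [0, 1]^3
c = np.array([1.5, -0.3, 0.7])
f = lambda x: float((x - c) @ (x - c))
grad_f = lambda x: 2.0 * (x - c)
lo, up = np.zeros(3), np.ones(3)

x_star = np.clip(c, lo, up)          # minimiser of f over the box
# weak duality: theta(x) <= f(x_star) at any x, feasible or not
for x in (np.zeros(3), np.ones(3), np.array([5.0, -2.0, 0.3])):
    assert theta(x, f, grad_f, lo, up) <= f(x_star) + 1e-12
print(theta(x_star, f, grad_f, lo, up), f(x_star))  # equal at the optimum
```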

slide-31
SLIDE 31

Interpretation

22 | 28

slide-34
SLIDE 34

A duality result

Theorem

The Wolfe dual of the convex differentiable problem

(P)  min_{x ∈ Rn} f(x)   s.t.  ℓ ≤ x ≤ u,  g(x) ≤ 0

is the bound constrained maximisation problem

(D)  max_{x, µ≥0} θ(x, µ)

with the dual function:

θ(x, µ) = L(x, µ) + (ℓ − x)T∇L(x, µ)+ + (u − x)T∇L(x, µ)−

23 | 28
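As in the box-constrained case, θ(x, µ) lower-bounds the optimal value for every x and every µ ≥ 0. A minimal one-dimensional sketch; the instance min x² s.t. 1 − x ≤ 0, 0 ≤ x ≤ 3 (optimum x∗ = 1, µ∗ = 2, value 1) is an assumption for illustration:

```python
def theta(x, mu, lo=0.0, up=3.0):
    """theta(x, mu) = L + (lo - x)*dL^+ + (up - x)*dL^-  in one dimension."""
    L = x**2 + mu * (1.0 - x)        # L(x, mu) = f(x) + mu * g(x)
    dL = 2.0 * x - mu                # gradient of L in x
    return L + (lo - x) * max(dL, 0.0) + (up - x) * min(dL, 0.0)

v_star = 1.0                         # optimal value, attained at x* = 1
for x in (0.0, 0.5, 1.0, 2.5):
    for mu in (0.0, 1.0, 2.0, 5.0):
        assert theta(x, mu) <= v_star + 1e-12   # weak duality
print(theta(1.0, 2.0))  # prints 1.0: strong duality at (x*, mu*)
```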

slide-41
SLIDE 41

Applications

◮ Robustness of dual bounds and inconsistency certificates
◮ Performance of some bound tightening techniques
◮ Solve min { f(x) | g(x) ≤ 0 } with an augmented Lagrangian
◮ Sequence of box-constrained subproblems min { Lρ(x) | ℓ ≤ x ≤ u }
◮ Approximate KKT: not enough, also use L̄ρ − L̲ρ ≤ ϵ
◮ Define L̄ρ = min (Lρ(x1), · · · , Lρ(xk)) and L̲ρ = max (θ(x1), · · · , θ(xk))
◮ −68% solver fails, −30% function evaluations, −15% Newton systems solved

24 | 28
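The loop of box-constrained subproblems can be sketched as follows on a hypothetical two-variable instance; the instance, the penalty parameter ρ and the classical multiplier update are assumptions for illustration, not the solver's actual implementation:

```python
import numpy as np
from scipy.optimize import minimize

# hypothetical instance: min x1 + x2  s.t.  1 - x1*x2 <= 0, 0 <= x <= 4
# (optimum x* = (1, 1), value 2)
f = lambda x: x[0] + x[1]
g = lambda x: 1.0 - x[0] * x[1]

def L_rho(x, mu, rho):
    """Augmented Lagrangian for a single inequality constraint."""
    return f(x) + rho / 2 * max(g(x) + mu / rho, 0.0) ** 2 - mu**2 / (2 * rho)

x, mu, rho = np.array([2.0, 2.0]), 0.0, 10.0
for _ in range(20):
    # box-constrained subproblem  min { L_rho(x) | l <= x <= u }
    res = minimize(L_rho, x, args=(mu, rho), bounds=[(0.0, 4.0)] * 2)
    x = res.x
    mu = max(0.0, mu + rho * g(x))   # classical multiplier update
print(x, f(x))
```

Each iteration only needs a bound-constrained solver (here SciPy's default L-BFGS-B), which is exactly the setting where the unconstrained dual function θ supplies a robust lower bound.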

slide-42
SLIDE 42

The end

Questions

25 | 28

slide-43
SLIDE 43

Bibliography I

[1] C. S. Adjiman, S. Dallwig, C. A. Floudas, and A. Neumaier. A global optimization method, αBB, for general twice-differentiable constrained NLPs, I: Theoretical advances. Computers & Chemical Engineering, 22(9):1137–1158, 1998.

[2] P. Belotti, J. Lee, L. Liberti, F. Margot, and A. Wächter. Branching and bounds tightening techniques for non-convex MINLP. Optimization Methods & Software, 24(4-5):597–634, 2009.

[3] A. Billionnet, S. Elloumi, and M.-C. Plateau. Improving the performance of standard solvers for quadratic 0-1 programs by a tight convex reformulation: the QCR method. Discrete Applied Mathematics, 157(6):1185–1197, 2009.

[4] L. Liberti and C. C. Pantelides. Convex envelopes of monomials of odd degree. Journal of Global Optimization, 25(2):157–168, 2003.

[5] A. Neumaier and O. Shcherbina. Safe bounds in linear and mixed-integer linear programming. Mathematical Programming, 99(2):283–296, 2004.

[6] H. S. Ryoo and N. V. Sahinidis. A branch-and-reduce approach to global optimization. Journal of Global Optimization, 8(2):107–138, 1996.

[7] E. M. Smith and C. C. Pantelides. A symbolic reformulation/spatial branch-and-bound algorithm for the global optimisation of nonconvex MINLPs. Computers & Chemical Engineering, 23(4-5):457–478, 1999.

26 | 28

slide-44
SLIDE 44

Bibliography II

[8] M. Tawarmalani and N. V. Sahinidis. Convexification and global optimization in continuous and mixed-integer nonlinear programming: theory, algorithms, software, and applications, volume 65. Springer Science & Business Media, 2002.

[9] P. Wolfe. A duality theorem for non-linear programming. Quarterly of Applied Mathematics, 19(3):239–244, 1961.

27 | 28

slide-46
SLIDE 46

Some issues with the solvers

◮ Smooth convex problem: min { f(x) | x ∈ Rn, g(x) ≤ 0 ∈ Rm }
◮ KKT certificate depends on L(x, µ) = f(x) + µTg(x):

Condition       | Exact           | Approximate
Stationarity    | ∇xL(x, µ) = 0   | ∥∇xL(x, µ)∥ ≤ ϵstatio
Feasibility     | gi(x) ≤ 0       | gi(x) ≤ ϵfeas
Complementarity | µigi(x) = 0     | |µigi(x)| ≤ ϵcompl

Primal:           min_x f(x)   s.t.  g(x) ≤ 0
Lagrangian dual:  max_{µ≥0} min_x f(x) + µTg(x)
Wolfe dual [9]:   max_{x, µ≥0} f(x) + µTg(x)   s.t.  ∇f(x) + ∑i µi∇gi(x) = 0

28 | 28