SLIDE 1

Direct Multisearch for Multiobjective Optimization

Ana Luísa Custódio¹  José F. Aguilar Madeira²  A. Ismael F. Vaz³  Luís Nunes Vicente⁴

¹Universidade Nova de Lisboa  ²IDMEC-IST, ISEL  ³Universidade do Minho  ⁴Universidade de Coimbra

Optimization 2011

July 24-27, 2011

A.I.F. Vaz (Optimization 2011) DMS July 24-27, 2011 1 / 53

SLIDE 2

Outline

1. Introduction and motivation
2. Direct MultiSearch
3. Numerical results
4. Further improvements on DMS
5. Conclusions and references

SLIDE 8

Introduction and motivation

Derivative-free multiobjective optimization

MOO problem

    min_{x ∈ Ω} F(x) ≡ (f_1(x), f_2(x), ..., f_m(x))^T

where Ω = {x ∈ R^n : ℓ ≤ x ≤ u}, f_j : R^n → R ∪ {+∞} for j = 1, ..., m, ℓ ∈ (R ∪ {−∞})^n, and u ∈ (R ∪ {+∞})^n.

  • Several objectives, often conflicting.
  • Functions with unknown derivatives.
  • Expensive function evaluations, possibly subject to noise.
  • Impractical to compute approximations to derivatives.
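The notion of Pareto dominance that underpins everything that follows can be made concrete with a small test (a sketch; the function name is ours, not from the talk):

```python
def dominates(fa, fb):
    """True when objective vector fa Pareto-dominates fb (minimization):
    fa is no worse in every component and strictly better in at least one."""
    return all(a <= b for a, b in zip(fa, fb)) and \
           any(a < b for a, b in zip(fa, fb))
```

A point is nondominated in a set when no other point of the set dominates its objective vector.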

SLIDE 14

Direct MultiSearch

DMS algorithmic main lines

  • Does not aggregate any of the objective functions.
  • Generalizes ALL direct-search methods of directional type to MOO.
  • Makes use of the search/poll paradigm.
  • Implements an optional search step (only to disseminate the search).
  • Tries to capture the whole Pareto front from the polling procedure.
  • Keeps a list of feasible nondominated points.
  • Poll centers are chosen from the list.
  • Successful iterations correspond to list changes.
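The main lines above can be sketched as a minimal poll-only loop. This is an illustrative sketch under simplifying assumptions (round-robin poll centers, coordinate directions [I, −I], no search step, a crude per-center stopping rule); all names are ours:

```python
def dominates(fa, fb):
    """fa Pareto-dominates fb: no worse everywhere, strictly better somewhere."""
    return all(a <= b for a, b in zip(fa, fb)) and \
           any(a < b for a, b in zip(fa, fb))

def dms(F, lower, upper, x0, alpha0=1.0, tol=1e-3, max_evals=2000):
    """Minimal DMS-style sketch: no search step, polling with [I, -I],
    step size halved at unsuccessful iterations."""
    n = len(x0)
    L = [(tuple(x0), tuple(F(x0)), alpha0)]   # nondominated list: (x, F(x), alpha)
    evals, k = 1, 0
    while evals < max_evals:
        x, fx, alpha = L[k % len(L)]          # poll center, chosen round-robin
        if alpha < tol:                       # crude stopping rule for the sketch
            break
        trial = []
        for i in range(n):                    # poll step: x + alpha*d, d in [I, -I]
            for s in (+1.0, -1.0):
                y = list(x)
                y[i] += s * alpha
                if all(lo <= yi <= up for yi, lo, up in zip(y, lower, upper)):
                    trial.append((tuple(y), tuple(F(y)), alpha))
                    evals += 1
        merged = list({p[0]: p for p in L + trial}.values())  # dedupe by point
        new_L = [p for p in merged
                 if not any(dominates(q[1], p[1]) for q in merged)]
        if {p[0] for p in new_L} != {p[0] for p in L}:
            L = new_L                           # success: the list changed
        else:
            L[k % len(L)] = (x, fx, alpha / 2)  # unsuccessful: halve the step
        k += 1
    return L
```

On a toy biobjective problem such as F(x) = (x², (x − 1)²) on [−2, 2], the returned list approximates the Pareto front between the two individual minimizers.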

SLIDE 22

Direct MultiSearch

DMS example

[Figure: DMS iterations shown side by side in the decision space (x1, x2) and the objective space (f1, f2); the original deck advances this example over several frames.]

SLIDE 31

Direct MultiSearch

DMS search & poll steps

At each iteration, consider a list of feasible nondominated points → Lk.
  • Evaluate a finite set of feasible points → Ladd.
  • Remove dominated points from Lk ∪ Ladd → Lfiltered.
  • Select a list of feasible nondominated points → Ltrial.
  • Compare Ltrial to Lk (success if Ltrial ≠ Lk, unsuccessful otherwise).
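The list update described above (merge, filter, compare) can be sketched as follows; names are ours, and points are assumed to be hashable:

```python
def dominates(fa, fb):
    """fa Pareto-dominates fb: no worse everywhere, strictly better somewhere."""
    return all(a <= b for a, b in zip(fa, fb)) and \
           any(a < b for a, b in zip(fa, fb))

def iterate_list(L_k, L_add, F):
    """One list update: merge L_k and L_add, drop dominated points, and
    report success, which means the nondominated list changed (L_trial != L_k)."""
    merged = list(dict.fromkeys(L_k + L_add))   # drop duplicates, keep order
    vals = {x: F(x) for x in merged}
    L_trial = [x for x in merged
               if not any(dominates(vals[y], vals[x]) for y in merged)]
    return L_trial, set(L_trial) != set(L_k)
```

In the sketch, L_trial keeps all nondominated points; a real implementation may instead select a subset of Lfiltered at this step.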

SLIDE 36

Direct MultiSearch

Numerical Example — Problem SP1 [Huband et al.]

[Figure: objective-space plot for problem SP1 showing the evaluated points since the beginning and the current iterate list; the original deck steps through several poll iterations, highlighting the evaluated poll points and the nondominated ones.]

SLIDE 47

Direct MultiSearch

Refining subsequences and directions

For both globalization strategies (using the mesh or the forcing function in the search step), one also has:

Theorem (existence of refining subsequences). There is at least one convergent subsequence of iterates {x_k}_{k∈K}, corresponding to unsuccessful poll steps, such that α_k → 0 in K.

Definition. Let x* be the limit point of a convergent refining subsequence. Refining directions for x* are the limit points of {d_k/‖d_k‖}_{k∈K}, where d_k ∈ D_k and x_k + α_k d_k ∈ Ω.

SLIDE 50

Direct MultiSearch

Pareto-Clarke critical point

Let us focus (for simplicity) on the unconstrained case, Ω = R^n.

Definition. x* is a Pareto-Clarke critical point of F (Lipschitz continuous near x*) if, for all d ∈ R^n, there exists j = j(d) such that f_j°(x*; d) ≥ 0.

SLIDE 51

Direct MultiSearch

Analysis of DMS

Assumption. {x_k}_{k∈K} is a refining subsequence converging to x*, and F is Lipschitz continuous near x*.

Theorem. If v is a refining direction for x*, then there exists j = j(v) such that f_j°(x*; v) ≥ 0.

SLIDE 55

Direct MultiSearch

Convergence analysis of DMS

Theorem. If the set of refining directions for x* is dense in R^n, then x* is a Pareto-Clarke critical point.

Notes:
  • When m = 1, the presented results coincide with the ones reported for direct search.
  • This convergence analysis is valid for multiobjective problems with general nonlinear constraints.

SLIDE 60

Numerical results

Numerical testing framework

Problems
  • 100 bound-constrained MOO problems (AMPL models available at http://www.mat.uc.pt/dms).
  • Number of variables between 1 and 30.
  • Number of objectives between 2 and 4.

Solvers
  • DMS tested against 8 different MOO solvers (complete results available at http://www.mat.uc.pt/dms). Results reported only for:
      AMOSA – simulated annealing code.
      BIMADS – based on the mesh adaptive direct search algorithm.
      NSGA-II (C version) – genetic algorithm code.
  • All solvers tested with default values.

SLIDE 65

Numerical results

DMS numerical options

  • No search step.
  • List initialization: sample along the line ℓ–u.
  • List selection: all current feasible nondominated points.
  • List ordering: new points added at the end of the list; poll center moved to the end of the list.
  • Positive basis: [I − I].
  • Step size parameter: α₀ = 1, halved at unsuccessful iterations.
  • Stopping criteria: minimum step size of 10⁻³ or a maximum of 20000 function evaluations.
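The "line" list initialization can be sketched as equally spaced points on the segment joining ℓ to u (the function name and the n_points parameter are ours; n_points ≥ 2 is assumed):

```python
def line_sample(lower, upper, n_points):
    """Equally spaced points on the segment from the lower bound vector
    to the upper bound vector (assumes n_points >= 2)."""
    return [tuple(lo + t * (up - lo) for lo, up in zip(lower, upper))
            for t in (i / (n_points - 1) for i in range(n_points))]
```

The DMS(n,line) variant in the talk initializes the list with a sample of this form before the first poll step.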

SLIDE 72

Numerical results

Performance metrics — Purity

  • F_{p,s}: approximated Pareto front computed by solver s for problem p.
  • F_p: approximated Pareto front for problem p, computed using the results of all solvers.
  • Purity value for solver s on problem p:

        |F_{p,s} ∩ F_p| / |F_{p,s}|
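The purity value follows directly from the two fronts (a sketch; the function name is ours):

```python
def purity(front_s, front_ref):
    """Purity of solver s on problem p: the fraction of its approximated
    front F_{p,s} that survives in the reference front F_p."""
    Fs, Fref = set(front_s), set(front_ref)
    return len(Fs & Fref) / len(Fs)
```

A purity of 1 means every point the solver produced belongs to the combined reference front.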

SLIDE 73

Numerical results

Comparing DMS to other solvers (Purity)

[Figure: purity performance profile (ρ vs. τ) comparing DMS(n,line) with BIMADS.]

Purity metric (percentage of points generated in the reference Pareto front):

    t_{p,s} = |F_{p,s}| / |F_{p,s} ∩ F_p|

SLIDE 74

Numerical results

Comparing DMS to other solvers (Purity)

[Figure: purity performance profiles (ρ vs. τ), best of 10 runs, comparing DMS(n,line) with AMOSA; a second panel shows the extended τ range.]

Purity metric (percentage of points generated in the reference Pareto front):

    t_{p,s} = |F_{p,s}| / |F_{p,s} ∩ F_p|

SLIDE 75

Numerical results

Comparing DMS to other solvers (Purity)

[Figure: purity performance profiles (ρ vs. τ), best of 10 runs, comparing DMS(n,line) with NSGA-II (C version); a second panel shows the extended τ range.]

Purity metric (percentage of points generated in the reference Pareto front):

    t_{p,s} = |F_{p,s}| / |F_{p,s} ∩ F_p|

SLIDE 76

Numerical results

Performance metrics — Spread

Gamma metric (largest gap in the Pareto front):

    Γ_{p,s} = max_{j ∈ {1,...,m}}  max_{i ∈ {0,...,N}}  δ_{i,j}

[Figure: biobjective front (f1, f2) showing the obtained points, the computed extreme points, and the gaps δ_{0,j}, ..., δ_{N,j}.]

SLIDE 77

Numerical results

Comparing DMS to other solvers (Spread)

[Figure: average Γ performance profiles (ρ vs. τ) over 10 runs for DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA; a second panel shows the extended τ range. Gamma metric: largest gap in the Pareto front.]

SLIDE 78

Numerical results

Performance metrics — Spread

Delta metric (uniformity of gaps in the Pareto front):

    ∆_{p,s} = max_{j ∈ {1,...,m}}  ( δ_{0,j} + δ_{N,j} + Σ_{i=1}^{N−1} |δ_{i,j} − δ̄_j| ) / ( δ_{0,j} + δ_{N,j} + (N − 1) δ̄_j )

where δ̄_j, for j = 1, ..., m, is the average of the δ_{i,j}.

[Figure: biobjective front (f1, f2) showing the obtained points, the computed extreme points, and the gaps δ_{0,j}, ..., δ_{N,j}.]
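Both spread metrics can be sketched for the biobjective case, under the assumptions that the front is sorted by the first objective and already contains the computed extreme points as its first and last entries (the function name, signature, and these assumptions are ours):

```python
def gamma_delta(front):
    """Γ (largest gap) and ∆ (gap uniformity) for a sorted biobjective front
    whose first and last entries are the computed extreme points."""
    m = 2
    gammas, deltas = [], []
    for j in range(m):
        # consecutive gaps delta_{0,j}, ..., delta_{N,j} in objective j
        gaps = [abs(front[i + 1][j] - front[i][j]) for i in range(len(front) - 1)]
        gammas.append(max(gaps))
        interior = gaps[1:-1]                       # delta_{1,j}, ..., delta_{N-1,j}
        mean = sum(interior) / max(len(interior), 1)
        num = gaps[0] + gaps[-1] + sum(abs(g - mean) for g in interior)
        den = gaps[0] + gaps[-1] + len(interior) * mean
        deltas.append(num / den)
    return max(gammas), max(deltas)
```

For a perfectly uniform front, the |δ − δ̄| terms vanish, so ∆ reduces to 2δ/((N + 1)δ); larger values indicate less uniform spacing.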

SLIDE 79

Numerical results

Comparing DMS to other solvers (Spread)

[Figure: average ∆ performance profiles (ρ vs. τ) over 10 runs for DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA; a second panel shows the extended τ range. Delta metric: uniformity of gaps in the Pareto front.]

SLIDE 80

Numerical results

Comparing DMS to other solvers

[Figure: data profiles d_s(σ), best of 10 runs (ε = 0.05), for DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA; maximum of 5000 function evaluations.]

SLIDE 81

Numerical results

Comparing DMS to other solvers

[Figure: data profiles d_s(σ), best of 10 runs (ε = 0.5), for DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA; maximum of 5000 function evaluations.]

SLIDE 83

Further improvements on DMS

Comparing DMS to other solvers (Purity)

[Figure: purity performance profile (ρ vs. τ) comparing DMS(n,line,cache,spread) with BIMADS.]

Purity metric (percentage of points generated in the reference Pareto front):

    t_{p,s} = |F_{p,s}| / |F_{p,s} ∩ F_p|

SLIDE 84

Further improvements on DMS

Comparing DMS to other solvers (Purity)

0.5 1 1.5 2 2.5 3 3.5 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1

Purity performance profile with the best of 10 runs

τ ρ

DMS(n,line,cache,spread) NSGA−II (C version)

10 20 30 0.1 0.2 0.3 0.4 0.5 0.6 0.7 0.8 0.9 1 τ ρ

Purity Metric (percentage of points generated in the reference Pareto front)

tp,s = |Fp,s| |Fp,s ∩ Fp|

SLIDE 85

Further improvements on DMS

Comparing DMS to other solvers (Spread)

[Figure: Average Γ performance profile for 10 runs (ρ vs. τ), with an extended-τ panel; solvers: DMS(n,line), DMS(n,line,cache,spread), BIMADS, NSGA−II (C version), AMOSA]

Gamma Metric (largest gap in the Pareto front)
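The Γ metric reports the largest gap left in the computed Pareto front. A simplified two-objective sketch (the paper's definition covers any number of objectives; the function name and the sort-by-first-objective convention are assumptions):

```python
def gamma_metric(front):
    """Largest gap in a biobjective Pareto front (simplified sketch).
    Points are ordered by the first objective; the gap between two
    consecutive points is the largest change in either objective, and
    Gamma is the maximum such gap over the whole front."""
    pts = sorted(front)  # traverse the front along the first objective
    gaps = [max(abs(b[0] - a[0]), abs(b[1] - a[1]))
            for a, b in zip(pts, pts[1:])]
    return max(gaps) if gaps else 0.0
```

For the front {(0,3), (1,2), (3,0)}, the consecutive gaps are 1 and 2, so Γ = 2; a smaller Γ means the front has no large uncovered stretch.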

SLIDE 86

Further improvements on DMS

Comparing DMS to other solvers (Spread)

[Figure: Average Δ performance profile for 10 runs (ρ vs. τ), with an extended-τ panel; solvers: DMS(n,line), DMS(n,line,cache,spread), BIMADS, NSGA−II (C version), AMOSA]

Delta Metric (uniformity of gaps in the Pareto front)

SLIDE 87

Further improvements on DMS

Comparing DMS to other solvers

[Figure: Data profile with the best of 10 runs, ε = 0.05 (d_s(σ) vs. σ); solvers: DMS(n,line), DMS(n,line,cache,spread), BIMADS, NSGA−II (C version), AMOSA]

Maximum number of function evaluations: 5000

SLIDE 88

Further improvements on DMS

Comparing DMS to other solvers

[Figure: Data profile with the best of 10 runs, ε = 0.5 (d_s(σ) vs. σ); solvers: DMS(n,line), DMS(n,line,cache,spread), BIMADS, NSGA−II (C version), AMOSA]

Maximum number of function evaluations: 5000

SLIDE 89

Conclusions and references

Outline

1. Introduction and motivation
2. Direct MultiSearch
3. Numerical results
4. Further improvements on DMS
5. Conclusions and references

SLIDE 90

Conclusions and references

Conclusions and references

  • Development and analysis of a novel approach (Direct MultiSearch) for MOO, generalizing all direct-search methods.

  • Direct MultiSearch (DMS) exhibits highly competitive numerical results for MOO.

  • DMS (Matlab implementation) and test problems (coded in AMPL) are freely available at http://www.mat.uc.pt/dms.

  • A. L. Custódio, J. F. A. Madeira, A. I. F. Vaz, and L. N. Vicente, Direct multisearch for multiobjective optimization, SIAM Journal on Optimization, to appear.
