SLIDE 1

Direct multisearch for multiobjective optimization

  • A. Ismael F. Vaz

University of Minho - Portugal aivaz@dps.uminho.pt Joint work with A. L. Custódio, J. F. A. Madeira, and L. N. Vicente

Southampton October 21, 2010

A.I.F. Vaz (UMinho) DMS October 21, 2010 1 / 64

SLIDE 2

Outline

1. Introduction
2. Direct search
3. Direct search for single objective
4. Direct search for multiobjective
5. Numerical results
6. Conclusions and references

SLIDE 8

Introduction


SLIDE 9

Introduction

Why Derivative-Free Optimization?

Some of the reasons to apply derivative-free optimization are the following:

  • Nowadays computer hardware and mathematical algorithms allow increasingly large simulations.
  • Functions are noisy (one cannot trust derivatives or approximate them by finite differences).
  • Binary codes (source code not available) and random simulations make automatic differentiation impossible to apply.
  • Legacy codes (written in the past and not maintained by the original authors).
  • Lack of sophistication of the user (users need improvement but want to use something simple).

SLIDE 14

Direct search


SLIDE 15

Direct search

Direct-search methods

Definition:

  • Sample the objective function at a finite number of points at each iteration.
  • Base actions on those function values.
  • Do not depend on derivative approximation or model building.

Direct search of directional type: achieve descent by using positive spanning sets and moving in the directions of the best points.

SLIDE 20

Direct search

Coordinate search (poll step & simple decrease)

[Figure sequence (slides 20–33): coordinate-search iterations illustrating the poll step and simple decrease.]
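The coordinate-search scheme illustrated here can be sketched in a few lines of Python. This is an illustrative sketch, not the code behind the slides; the function and parameter names are my own. It polls the 2n coordinate directions, accepts a point on simple decrease, and halves the step size after an unsuccessful poll:

```python
import numpy as np

def coordinate_search(f, x0, alpha0=1.0, alpha_tol=1e-6, max_iter=1000):
    """Poll the 2n coordinate directions; accept on simple decrease,
    halve the step size after an unsuccessful poll."""
    x = np.asarray(x0, float)
    alpha = alpha0
    n = len(x)
    D = np.vstack([np.eye(n), -np.eye(n)])    # the coordinate directions {e_i, -e_i}
    for _ in range(max_iter):
        if alpha < alpha_tol:
            break
        for d in D:                           # poll step
            if f(x + alpha * d) < f(x):       # simple decrease
                x = x + alpha * d             # successful: move, keep alpha
                break
        else:
            alpha /= 2                        # unsuccessful: halve the step size
    return x

# Minimize a smooth function without derivatives (illustrative objective)
x_star = coordinate_search(lambda x: (x[0] - 1.0) ** 2 + (x[1] + 2.0) ** 2,
                           [0.0, 0.0])
```

Step doubling after consecutive successes, mentioned later in the deck, is omitted to keep the sketch minimal.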

SLIDE 34

Direct search for single objective


SLIDE 35

Direct search for single objective

Derivative-free optimization

Problem formulation (single objective)

    min_{x ∈ Ω} f(x)

where

  • Ω = {x ∈ R^n : ℓ ≤ x ≤ u}
  • f : R^n → R ∪ {+∞}
  • ℓ ∈ (R ∪ {−∞})^n and u ∈ (R ∪ {+∞})^n

We aim to solve this problem without using derivatives of f.

SLIDE 36

Direct search for single objective

Some definitions

Positive spanning set: a set of vectors that spans R^n with nonnegative coefficients.

Examples:

  • D⊕ = {e1, . . . , en, −e1, . . . , −en}
  • D⊗ = {e1, . . . , en, −e1, . . . , −en, e, −e}, where e is the vector of all ones

Extreme barrier function:

    fΩ(x) = f(x) if x ∈ Ω, and fΩ(x) = +∞ otherwise.
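These definitions map directly to code. The sketch below is illustrative (the helper names are my own): it builds D⊕ and D⊗ for R^n and wraps an objective in the extreme barrier for bound constraints ℓ ≤ x ≤ u:

```python
import numpy as np

def d_plus(n):
    """D_plus = {e1,...,en, -e1,...,-en}: the 2n coordinate directions."""
    I = np.eye(n)
    return np.vstack([I, -I])

def d_times(n):
    """D_times = D_plus together with e = (1,...,1) and -e."""
    e = np.ones((1, n))
    return np.vstack([d_plus(n), e, -e])

def extreme_barrier(f, lower, upper):
    """Return f_Omega: f(x) if lower <= x <= upper componentwise, +inf otherwise."""
    lower, upper = np.asarray(lower, float), np.asarray(upper, float)
    def f_omega(x):
        x = np.asarray(x, float)
        if np.all(lower <= x) and np.all(x <= upper):
            return f(x)
        return np.inf
    return f_omega

# Wrap a toy objective on the box [0, 1] x [0, 1]
f_omega = extreme_barrier(lambda x: x.sum(), [0.0, 0.0], [1.0, 1.0])
```

Infeasible points simply evaluate to +∞, so any decrease-based acceptance test rejects them automatically.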

SLIDE 39

Direct search for single objective

A direct-search method

(0) Initialization: Choose x0 ∈ Ω and α0 > 0. Let D be a (possibly infinite) set of positive spanning sets.

For k = 0, 1, 2, . . .

(1) Search step (optional): Try to compute a point x, using a finite number of trial points, in the mesh

    Mk = {xk + αk Dk z : z ∈ N^|Dk|}, with Dk ⊆ D,

such that fΩ(x) < f(xk). If such a point is found, set xk+1 = x, declare the iteration and the search step successful, and skip the poll step.

SLIDE 44

Direct search for single objective

A direct-search method

(2) Poll step Optionally order the poll set Pk = {xk + αkd, d ∈ Dk} with Dk ⊆ D. If a poll point xk + αkdk is found such that fΩ(xk + αkdk) < f(xk) then stop polling, set xk+1 = xk + αkdk, and declare the iteration and the poll step successful. Otherwise declare the iteration (and the poll step) unsuccessful and set xk+1 = xk.

SLIDE 47

Direct search for single objective

A direct-search method

(3) Step size update: If the iteration was successful then maintain the step size parameter (αk+1 = αk) or double it (αk+1 = 2αk) after two consecutive poll successes along the same direction. If the iteration was unsuccessful, halve the step size parameter (αk+1 = αk/2).

SLIDE 52

Direct search for single objective

Some comments

We could present the previous algorithm in a different form, namely by:

  • fixing the set Dk (Dk = D, ∀k) so that it does not change with the iteration number (problem with only bound constraints);
  • allowing the set Dk to be computed in a way that conforms with possible linear constraints;
  • using a forcing function ρ(·) (e.g., ρ(t) = t²) instead of an integer lattice (the mesh Mk).

A forcing function ρ(·) is continuous, positive, and satisfies lim_{t→0+} ρ(t)/t = 0 and ρ(t1) ≤ ρ(t2) if t1 < t2. A point x is accepted (successful) in the search step if fΩ(x) < f(xk) − ρ(αk).

The presented algorithm is stated in exactly the form needed for the multiobjective version described next.
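The sufficient-decrease test with a forcing function can be written out explicitly. A minimal sketch with ρ(t) = t² (the function names are my own, for illustration):

```python
def rho(t):
    """Forcing function rho(t) = t**2: continuous, positive,
    rho(t)/t -> 0 as t -> 0+, and nondecreasing."""
    return t * t

def sufficient_decrease(f_trial, f_current, alpha_k):
    """Accept the trial point if f(x) < f(x_k) - rho(alpha_k)."""
    return f_trial < f_current - rho(alpha_k)

# With alpha_k = 0.1 the trial value must beat f(x_k) by more than 0.01
accepted = sufficient_decrease(0.98, 1.0, 0.1)
```

As the step size shrinks, the required decrease shrinks faster than αk, which is what the limit condition on ρ guarantees.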

SLIDE 53

Direct search for multiobjective


SLIDE 54

Direct search for multiobjective

Derivative-free multiobjective optimization

MOO problem

    min_{x ∈ Ω} F(x) ≡ (f1(x), f2(x), . . . , fm(x))⊤

where

  • Ω = {x ∈ R^n : ℓ ≤ x ≤ u}
  • fj : R^n → R ∪ {+∞}, j = 1, . . . , m
  • ℓ ∈ (R ∪ {−∞})^n and u ∈ (R ∪ {+∞})^n

  • Several objectives, often conflicting.
  • Functions with unknown derivatives.
  • Expensive function evaluations, possibly subject to noise.
  • Impractical to compute approximations to derivatives.
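To see what "conflicting objectives" means concretely, consider a standard toy bi-objective (my illustrative choice, not a problem from the talk): every x in [0, 1] is Pareto optimal, because improving f1 necessarily worsens f2:

```python
import numpy as np

# Toy bi-objective: f1(x) = x^2 and f2(x) = (x - 1)^2 pull in opposite
# directions, so no single x minimizes both simultaneously.
def F(x):
    return np.array([x ** 2, (x - 1.0) ** 2])

# At x = 0, f1 attains its minimum while f2 = 1; at x = 1 the roles reverse.
```

Methods for this setting therefore return a set of nondominated points rather than a single minimizer.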

SLIDE 58

Direct search for multiobjective

DMS algorithmic main lines

  • Does not aggregate any of the objective functions.
  • Generalizes ALL direct-search methods of directional type to MOO.
  • Makes use of the search/poll paradigm.
  • Implements an optional search step (only to disseminate the search).
  • Tries to capture the whole Pareto front from the polling procedure.

SLIDE 63

Direct search for multiobjective

DMS algorithmic main lines

  • Keeps a list of feasible nondominated points.
  • Poll centers are chosen from the list.
  • Successful iterations correspond to list changes.
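Maintaining the list of feasible nondominated points amounts to Pareto-dominance filtering. A minimal sketch (illustrative, not the authors' implementation; the names are my own):

```python
import numpy as np

def dominates(Fa, Fb):
    """Fa Pareto-dominates Fb: no worse in every objective, strictly better in one."""
    return bool(np.all(Fa <= Fb) and np.any(Fa < Fb))

def filter_nondominated(points):
    """Keep the objective vectors in `points` that no other vector dominates."""
    keep = []
    for i, Fi in enumerate(points):
        if not any(dominates(Fj, Fi) for j, Fj in enumerate(points) if j != i):
            keep.append(Fi)
    return keep

# (1, 2) and (2, 1) are mutually nondominated; (2, 2) is dominated by both
front = filter_nondominated([np.array([1.0, 2.0]),
                             np.array([2.0, 1.0]),
                             np.array([2.0, 2.0])])
```

The quadratic pairwise comparison is fine for small lists; a production code would use a more careful update as new points arrive.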

SLIDE 66

Direct search for multiobjective

DMS example

[Figure sequence (slides 66–74): DMS iterations on an example problem, shown frame by frame.]

SLIDE 75

Direct search for multiobjective

DMS search & poll steps

  • Evaluate a finite set of feasible points → Ladd.
  • Remove dominated points from Lk ∪ Ladd → Lfiltered.
  • Select a list of feasible nondominated points → Ltrial.
  • Compare Ltrial with Lk (success if Ltrial ≠ Lk, unsuccessful otherwise).

SLIDE 79

Direct search for multiobjective

Direct MultiSearch for MOO

(0) Initialization: Choose x0 ∈ Ω with F(x0) < +∞ and α0 > 0. Set L0 = {(x0; α0)}. Let D be a (possibly infinite) set of positive spanning sets.

For k = 0, 1, 2, . . .

(1) Selection of iterate point: Order Lk and select (xk; αk) ∈ Lk.

SLIDE 83

Direct search for multiobjective

Direct MultiSearch for MOO

(2) Search step (optional): Evaluate a finite set of points Ladd = {(zs; αk)}s∈S (in the mesh or using a forcing function). (Lk; Ladd) → Lfiltered → Ltrial. If success is achieved, set Lk+1 = Ltrial, declare the iteration and the search step successful, and skip the poll step.

SLIDE 85

Direct search for multiobjective

Direct MultiSearch for MOO

(3) Poll step: Evaluate Ladd = {(xk + αk d; αk), d ∈ Dk}, with Dk ⊆ D. (Lk; Ladd) → Lfiltered → Ltrial. If success is achieved, set Lk+1 = Ltrial and declare the iteration and the poll step successful. Otherwise, declare the iteration (and the poll step) unsuccessful and set Lk+1 = Ltrial.

SLIDE 88

Direct search for multiobjective

Direct MultiSearch for MOO

(4) Step size update: If the iteration was successful then maintain the step size parameter (αk+1 = αk) or double it (αk+1 = 2αk) after two consecutive poll successes along the same direction. If the iteration was unsuccessful, halve the step size parameter (αk+1 = αk/2).
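Steps (0)–(4) can be assembled into a minimal poll-only DMS loop. The sketch below is illustrative, not the authors' DMS code: the ordering rule (poll around the largest step size), all names, and the toy objective are my own choices, and the search step is omitted:

```python
import numpy as np

def dominates(Fa, Fb):
    """Fa Pareto-dominates Fb: no worse in every objective, strictly better in one."""
    return bool(np.all(Fa <= Fb) and np.any(Fa < Fb))

def dms(F, x0, alpha0=1.0, alpha_tol=1e-3, max_iter=200):
    """Minimal DMS sketch: keep a list L of (x, alpha, F(x)) entries, poll with
    the 2n coordinate directions, declare success when the nondominated list
    changes, and halve the poll center's alpha on failure."""
    n = len(x0)
    D = np.vstack([np.eye(n), -np.eye(n)])        # the coordinate directions
    x0 = np.asarray(x0, float)
    L = [(x0, alpha0, F(x0))]
    for _ in range(max_iter):
        # (1) Selection: poll around the entry with the largest step size
        idx = max(range(len(L)), key=lambda i: L[i][1])
        xk, ak, Fk = L[idx]
        if ak < alpha_tol:
            break
        # (3) Poll step: evaluate x_k + alpha_k d, skipping objective values
        # already present in the list
        seen = {tuple(Fx) for _, _, Fx in L}
        L_add = []
        for d in D:
            y = xk + ak * d
            Fy = F(y)
            if tuple(Fy) not in seen:
                seen.add(tuple(Fy))
                L_add.append((y, ak, Fy))
        # Filter L union L_add down to the nondominated entries
        merged = L + L_add
        trial = [e for e in merged
                 if not any(dominates(o[2], e[2]) for o in merged if o is not e)]
        # (4) Success means a new point survived the filtering (the list changed)
        if any(e is a for e in trial for a in L_add):
            L = trial                                        # keep step sizes
        else:
            L = [(x, a / 2 if x is xk else a, Fx)            # halve poll center's alpha
                 for x, a, Fx in trial]
    return L

# Run on the toy bi-objective f1 = x^2, f2 = (x - 1)^2 (illustrative choice)
front = dms(lambda x: np.array([x[0] ** 2, (x[0] - 1.0) ** 2]), [0.5])
```

Starting from a single point, the list grows into a mutually nondominated set that spreads along the Pareto front, which is the behavior the slides illustrate on problem SP1.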

SLIDE 90

Direct search for multiobjective

Numerical Example — Problem SP1 [Huband et al.]

[Figure sequence (slides 90–100): progress of DMS on problem SP1, plotting the evaluated poll points, all points evaluated since the beginning, the nondominated evaluated poll points, and the current iterate list at successive iterations.]
SLIDE 101

Direct search for multiobjective

Refining subsequences and directions

For both globalization strategies (using the mesh or the forcing function in the search step), one also has: Theorem (existence of refining subsequences) There is at least one convergent subsequence of iterates {xk}k∈K corresponding to unsuccessful poll steps, such that αk → 0 in K. Definition: Let x∗ be the limit point of a convergent refining subsequence. Refining directions for x∗ are limit points of {dk/‖dk‖}k∈K, where dk ∈ Dk and xk + αkdk ∈ Ω.

A.I.F. Vaz (UMinho) DMS October 21, 2010 44 / 64


slide-104
SLIDE 104

Direct search for multiobjective

Pareto-Clarke critical point

Let us focus (again, for simplicity) on the unconstrained case, Ω = Rn. Definition: x∗ is a Pareto-Clarke critical point of F (Lipschitz continuous near x∗) if, for every d ∈ Rn, there exists j = j(d) such that f◦j(x∗; d) ≥ 0.

A.I.F. Vaz (UMinho) DMS October 21, 2010 45 / 64

slide-105
SLIDE 105

Direct search for multiobjective

Analysis of DMS

Assumption: {xk}k∈K is a refining subsequence converging to x∗, and F is Lipschitz continuous near x∗. Theorem: If v is a refining direction for x∗, then there exists j = j(v) such that f◦j(x∗; v) ≥ 0.

A.I.F. Vaz (UMinho) DMS October 21, 2010 46 / 64


slide-109
SLIDE 109

Direct search for multiobjective

Analysis of DMS

Theorem If the set of refining directions for x∗ is dense in Rn, then x∗ is a Pareto-Clarke critical point. Notes When m = 1, we obtain the result presented before. This convergence analysis is valid for multiobjective problems with general nonlinear constraints.

A.I.F. Vaz (UMinho) DMS October 21, 2010 47 / 64


slide-113
SLIDE 113

Numerical results

Outline

1. Introduction
2. Direct search
3. Direct search for single objective
4. Direct search for multiobjective
5. Numerical results
6. Conclusions and references

A.I.F. Vaz (UMinho) DMS October 21, 2010 48 / 64

slide-114
SLIDE 114

Numerical results

Numerical testing framework

Problems: 100 bound-constrained MOO problems (AMPL models available at http://www.mat.uc.pt/dms); number of variables between 1 and 30; number of objectives between 2 and 4. Solvers: DMS tested against 8 different MOO solvers (complete results available at http://www.mat.uc.pt/dms). Results reported only for:
  • AMOSA – a simulated annealing code.
  • BIMADS – based on Mesh Adaptive Direct Search.
  • NSGA-II (C version) – a genetic algorithm code.
All solvers tested with default values.

A.I.F. Vaz (UMinho) DMS October 21, 2010 49 / 64


slide-119
SLIDE 119

Numerical results

DMS numerical options

No search step. List initialization: sample along the line ℓ–u. List selection: all current nondominated points. List ordering: new points added at the end of the list, poll center moved to the end of the list. Positive basis: [I − I]. Step size parameter: α0 = 1, halved at unsuccessful iterations. Stopping criteria: minimum step size of 10−3 or a maximum of 20000 function evaluations.

A.I.F. Vaz (UMinho) DMS October 21, 2010 50 / 64


slide-126
SLIDE 126

Numerical results

Performance metrics — Purity

Fp,s: approximated Pareto front computed by solver s for problem p. Fp: approximated Pareto front computed for problem p, using the results of all solvers. Purity value for solver s on problem p: |Fp,s ∩ Fp| / |Fp,s|.

A.I.F. Vaz (UMinho) DMS October 21, 2010 51 / 64
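The purity computation reduces to set operations on the two fronts. A minimal sketch (the function name is hypothetical; objective vectors are represented as tuples):

```python
def purity(front_s, front_ref):
    """|F_{p,s} ∩ F_p| / |F_{p,s}| for one solver s on one problem p."""
    front_s, front_ref = set(front_s), set(front_ref)
    return len(front_s & front_ref) / len(front_s)

# Illustrative fronts: 3 of the 4 points of F_ps lie in the reference front.
F_ps = {(1.0, 3.0), (1.5, 1.5), (2.0, 2.0), (3.0, 1.0)}
F_p  = {(1.0, 3.0), (1.5, 1.5), (3.0, 1.0), (0.5, 4.0)}
print(purity(F_ps, F_p))  # 0.75
```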

slide-127
SLIDE 127

Numerical results

Performance profiles [Dolan and Moré]

Let tp,s be a metric for which lower values indicate better performance. Consider ρs(τ) = |{p ∈ P : rp,s ≤ τ}| / |P|, with rp,s = tp,s / min{tp,s : s ∈ S}, where S is the set of solvers and P is the set of problems. Incorporates results for all problems and all solvers. Allows one to assess 'efficiency' and robustness: ρs(1) represents the 'efficiency' of solver s; ρs(τ), for τ large, gives the robustness of solver s. The lower the value of tp,s, the better.

A.I.F. Vaz (UMinho) DMS October 21, 2010 52 / 64
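The profile ρs(τ) can be computed directly from a table of metric values. A minimal sketch (hypothetical names; the metric values below are purely illustrative):

```python
def perf_profile(t, s, tau):
    """rho_s(tau) = |{p : r_{p,s} <= tau}| / |P|, for a table t of
    metric values with one row per problem (lower is better)."""
    hits = 0
    for row in t:              # one row of metric values per problem
        r = row[s] / min(row)  # performance ratio r_{p,s}
        hits += (r <= tau)
    return hits / len(t)

# Two solvers on three problems (hypothetical metric values).
t = [[1.0, 2.0],
     [3.0, 1.5],
     [2.0, 2.0]]
print(perf_profile(t, 0, 1.0))  # 'efficiency' of solver 0
print(perf_profile(t, 0, 2.0))  # robustness within a factor of 2
```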


slide-133
SLIDE 133

Numerical results

Comparing DMS to other solvers (Purity)

[Figure: 'Purity performance profile' (ρ vs. τ), comparing DMS(n,line) with BIMADS.]

Purity metric (percentage of points generated in the reference Pareto front):

tp,s = |Fp,s| / |Fp,s ∩ Fp|

A.I.F. Vaz (UMinho) DMS October 21, 2010 53 / 64

slide-134
SLIDE 134

Numerical results

Comparing DMS to other solvers (Purity)

[Figure: 'Purity performance profile with the best of 10 runs' (ρ vs. τ), comparing DMS(n,line) with AMOSA.]

Purity metric (percentage of points generated in the reference Pareto front):

tp,s = |Fp,s| / |Fp,s ∩ Fp|

A.I.F. Vaz (UMinho) DMS October 21, 2010 54 / 64

slide-135
SLIDE 135

Numerical results

Comparing DMS to other solvers (Purity)

[Figure: 'Purity performance profile with the best of 10 runs' (ρ vs. τ), comparing DMS(n,line) with NSGA-II (C version).]

Purity metric (percentage of points generated in the reference Pareto front):

tp,s = |Fp,s| / |Fp,s ∩ Fp|

A.I.F. Vaz (UMinho) DMS October 21, 2010 55 / 64

slide-136
SLIDE 136

Numerical results

Performance metrics — Spread

Gamma metric (largest gap in the Pareto front): Γp,s = maxi∈{0,...,N} di

Delta metric (uniformity of gaps in the Pareto front):

∆p,s = (d0 + dN + Σi=1..N−1 |di − d̄|) / (d0 + dN + (N−1) d̄),

where d̄ is the average of the di.

[Figure: gaps d0, d1, ..., dN in the (f1, f2) objective space, between the computed extreme points and the obtained points.]

A.I.F. Vaz (UMinho) DMS October 21, 2010 56 / 64
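Under the simplifying assumptions of a biobjective front sorted along f1, gaps measured along the first objective only (the actual metric uses distances in the objective space), and the computed extreme points already included in the list, both spread metrics could be sketched as:

```python
def spread_metrics(front):
    """Return (Gamma, Delta) for a list of (f1, f2) points.
    Assumes at least 4 points, extreme points included."""
    pts = sorted(front)  # order the front along f1
    # Consecutive gaps d_0, ..., d_N measured along f1 (simplification).
    d = [b[0] - a[0] for a, b in zip(pts, pts[1:])]
    gamma = max(d)                     # largest gap in the front
    d0, dN, inner = d[0], d[-1], d[1:-1]
    dbar = sum(inner) / len(inner)     # average of the interior gaps
    delta = ((d0 + dN + sum(abs(di - dbar) for di in inner))
             / (d0 + dN + len(inner) * dbar))
    return gamma, delta

front = [(0.0, 3.0), (1.0, 2.0), (1.5, 1.5), (3.0, 0.0)]
gamma, delta = spread_metrics(front)
print(gamma, delta)
```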

slide-137
SLIDE 137

Numerical results

Comparing DMS to other solvers (Spread)

[Figure: 'Average Γ performance profile for 10 runs' (ρ vs. τ), comparing DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA.]

Gamma metric (largest gap in the Pareto front): Γp,s = maxi∈{0,...,N} di

A.I.F. Vaz (UMinho) DMS October 21, 2010 57 / 64

slide-138
SLIDE 138

Numerical results

Comparing DMS to other solvers (Spread)

[Figure: 'Average Δ performance profile for 10 runs' (ρ vs. τ), comparing DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA.]

Delta metric (uniformity of gaps in the Pareto front):

∆p,s = (d0 + dN + Σi=1..N−1 |di − d̄|) / (d0 + dN + (N−1) d̄)

A.I.F. Vaz (UMinho) DMS October 21, 2010 58 / 64

slide-139
SLIDE 139

Numerical results

Data profiles [Moré and Wild]

Indicate how likely an algorithm is to 'solve' a problem, given some computational budget. Let hp,s be the number of function evaluations required for solver s to solve problem p. Consider ds(σ) = |{p ∈ P : hp,s ≤ σ}| / |P|. A problem is solved to ε-accuracy when |Fp,s ∩ Fp| / (|Fp|/|S|) ≥ 1 − ε.

A.I.F. Vaz (UMinho) DMS October 21, 2010 59 / 64
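A data profile follows the same counting pattern as a performance profile, but against an absolute budget of function evaluations. A minimal sketch (hypothetical names; the evaluation counts are illustrative):

```python
import math

def data_profile(h, s, sigma):
    """d_s(sigma) = |{p : h_{p,s} <= sigma}| / |P|, where h has one row
    per problem of per-solver evaluation counts (inf if unsolved)."""
    return sum(row[s] <= sigma for row in h) / len(h)

# Hypothetical evaluation counts for two solvers on four problems.
h = [[120, 300],
     [5000, 950],
     [math.inf, 700],   # solver 0 never solved this problem
     [40, 60]]
print(data_profile(h, 0, 1000))  # fraction solved within 1000 evaluations
```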

slide-140
SLIDE 140

Numerical results

Comparing DMS to other solvers

[Figure: 'Data profile with the best of 10 runs (ε=0.1)' — ds(σ) vs. σ, comparing DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA.]

Maximum number of function evaluations: 5000.

A.I.F. Vaz (UMinho) DMS October 21, 2010 60 / 64

slide-141
SLIDE 141

Numerical results

Comparing DMS to other solvers

[Figure: 'Data profile with the best of 10 runs (ε=0.5)' — ds(σ) vs. σ, comparing DMS(n,line), BIMADS, NSGA-II (C version), and AMOSA.]

Maximum number of function evaluations: 5000.

A.I.F. Vaz (UMinho) DMS October 21, 2010 61 / 64

slide-142
SLIDE 142

Conclusions and references

Outline

1. Introduction
2. Direct search
3. Direct search for single objective
4. Direct search for multiobjective
5. Numerical results
6. Conclusions and references

A.I.F. Vaz (UMinho) DMS October 21, 2010 62 / 64

slide-143
SLIDE 143

Conclusions and references

Conclusions and references

Development and analysis of a novel approach (Direct MultiSearch) for MOO, generalizing ALL direct-search methods. Direct MultiSearch (DMS) exhibits highly competitive numerical results for MOO. DMS (Matlab implementation) and problems (coded in AMPL) freely available at: http://www.mat.uc.pt/dms.

  • A. L. Custódio, J. F. A. Madeira, A. I. F. Vaz, and L. N. Vicente, Direct multisearch for multiobjective optimization, preprint 10-18, Dept. of Mathematics, Univ. Coimbra, 2010.

A.I.F. Vaz (UMinho) DMS October 21, 2010 63 / 64


slide-147
SLIDE 147

Conclusions and references

Optimization 2011 (July 24–27, Portugal)

A.I.F. Vaz (UMinho) DMS October 21, 2010 64 / 64