SLIDE 1

Tuning Optimization Algorithms under Multiple Objective Function Evaluation Budgets

Antoine S. Dymond

Department of Mechanical and Aeronautical Engineering, University of Pretoria, South Africa

Departmental Public Defence

25 July 2014

A. Dymond

Tuning Optimization Algorithms 25 July 2014 1 / 20

SLIDE 2

Introduction

Numerical optimization

Numerical optimization forms a pivotal part of many design processes. It has three parts:

  • Modeling
  • Searching for the optimum of the generated model
  • Validation

Focus of the PhD: optimization algorithms for searching for the model's optimum.
SLIDE 3

Introduction

Optimization considerations

No single optimization algorithm works best for all problems (No Free Lunch [4]). The algorithm and its control parameter values (CPVs) need to be selected according to:

  • objective function characteristics
  • constraints imposed
  • termination criteria, i.e. objective function evaluation (OFE) budgets

SLIDE 4

Introduction

Goal and Contributions

Goal: aid practitioners in selecting an optimization algorithm and CPVs appropriate for the problem at hand.

Approach: present tools for determining algorithms and CPVs well suited to representative¹ testing problems.

Contributions:

  1. tMOPSO
  2. MOTA
  3. Benchmarking via tunability

¹ suspected of being representative...

SLIDE 5

tMOPSO

tuning multi-objective particle swarm optimization algorithm

Tuning entails changing an algorithm's settings (CPVs) so as to improve its performance. tMOPSO tunes an algorithm to a single criterion under multiple OFE budgets:

  OFE budget   best CPVs (N, F, Cr)
  ----------   --------------------
  20           20, 0.7, 0.2
  50            5, 0.5, 0.1
  100          10, 0.9, 0.4
  ...          ...
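The table above can be read as a budget-indexed lookup. A minimal sketch of that reading (illustrative only: the dictionary layout and the fallback rule are assumptions for this sketch, not tMOPSO's interface; the numbers are the example rows from the table):

```python
# Illustrative sketch: tMOPSO's result viewed as a mapping from OFE budget
# to the best control parameter values (CPVs) found for that budget.
best_cpvs = {
    20:  {"N": 20, "F": 0.7, "Cr": 0.2},
    50:  {"N": 5,  "F": 0.5, "Cr": 0.1},
    100: {"N": 10, "F": 0.9, "Cr": 0.4},
}

def cpvs_for_budget(budget):
    """Return the tuned CPVs for the largest tabulated budget <= budget,
    or None if the budget is below every tabulated entry (assumed rule)."""
    usable = [b for b in best_cpvs if b <= budget]
    return best_cpvs[max(usable)] if usable else None
```

For a 60-OFE run this falls back to the 50-OFE entry, reflecting the point of the slide: the best CPVs change with the available budget.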

SLIDE 6

tMOPSO

Multi-objective optimization

minimize F(x) = [ f1(x), f2(x), ..., fnf(x) ]ᵀ

Pareto dominance: x1 ≺ x2 when:

  fk(x1) ≤ fk(x2), ∀ k ∈ {1, 2, ..., nf}, and
  ∃ k ∈ {1, 2, ..., nf} : fk(x1) < fk(x2).

[Figure: objective space F(x) for all x ∈ X, with decision space X ⊂ ℜ³ and Pareto-optimal front P]
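The dominance relation above translates directly into code. A minimal sketch for minimization, with objective vectors given as equal-length sequences (function name is ours, not from the thesis):

```python
def dominates(f_x1, f_x2):
    """True if x1 Pareto-dominates x2 (x1 ≺ x2): f_k(x1) <= f_k(x2) for
    every objective k, and strictly less for at least one k."""
    return (all(a <= b for a, b in zip(f_x1, f_x2))
            and any(a < b for a, b in zip(f_x1, f_x2)))
```

For example, (1.0, 2.0) dominates (1.5, 2.0), while (1.0, 3.0) and (2.0, 1.0) are mutually non-dominated.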

SLIDE 7

tMOPSO

contribution

tMOPSO combines into one algorithm:

  • multi-objective tuning according to speed vs. accuracy
  • history information
  • noise handling for tuning stochastic algorithms
  • efficient Pareto archives

[Figure: two speed-vs-accuracy plots, one showing points a, b, c and one showing only a and c, illustrating removal of a dominated point from the archive]
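The Pareto-archive bookkeeping behind these speed-vs-accuracy points can be sketched as follows. This is an illustrative linear-scan update, not tMOPSO's efficient archive implementation:

```python
def dominates(a, b):
    """a ≺ b for minimization: no worse in every objective, better in one."""
    return all(x <= y for x, y in zip(a, b)) and a != b

def update_archive(archive, candidate):
    """Add candidate to a Pareto archive: reject it if it is dominated by an
    archived point, otherwise insert it and drop any points it dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive  # candidate is dominated, archive unchanged
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

Repeated calls keep only mutually non-dominated points, mirroring the figure where a dominated point disappears from the archive.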

SLIDE 8

tMOPSO

Gauging tMOPSO

The conducted numerical experiments showed that tMOPSO is better than or equivalent to existing techniques:

  • in particular, for the conducted experiments tMOPSO outperformed the FBM algorithm
  • tMOPSO was found to be a more efficient alternative than setting up numerous single-OFE-budget tuning problems

SLIDE 9

tMOPSO

tMOPSO’s limitations

tMOPSO tunes according to one performance measure under multiple OFE budgets. This can be problematic when tuning to representative testing problems: an average performance measure may result in over-tuning. The risk can be greatly reduced through many-objective tuning.

SLIDE 10

MOTA

many objective tuning algorithm

MOTA is designed to tune an algorithm to multiple performance criteria over multiple OFE budgets.

Contribution: this has not been done before.

Applications:

  • tuning to a problem suite, multi-objectively (lower risk of over-tuning)
  • tuning multi-objective algorithms

SLIDE 11

MOTA

many objective optimization

Many-objective = 4 or more objectives. Pareto dominance alone is not enough: the Pareto front (PF) size grows exponentially with the number of objectives. MOTA therefore uses bi-objective decomposition, which is well suited to many-objective tuning under multiple OFE budgets:

  • history information
  • efficient Pareto operations
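One way to picture bi-objective decomposition (an illustrative sketch of the idea only, not MOTA's actual scheme): pair each of the many tuning criteria with the shared speed axis, so that every subproblem is a familiar speed-vs-accuracy trade-off.

```python
def biobjective_subproblems(criteria):
    """Pair each performance criterion with the common speed objective
    (OFE budget). Each resulting bi-objective subproblem can reuse the
    history information and efficient Pareto operations from bi-objective
    tuning. Function and labels are assumptions for illustration."""
    return [("OFE budget", c) for c in criteria]

# Four tuning criteria (many-objective) become four bi-objective subproblems.
subs = biobjective_subproblems(["error on P1", "error on P2",
                                "error on P3", "error on P4"])
```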

SLIDE 12

MOTA

Numerical experiments

The conducted numerical experiments consisted of tuning NSGA-II [1] and MOEA/D [5]. MOTA's design is successful:

  • efficient at many-objective tuning
  • built from the ground up as a many-objective tuning algorithm!

SLIDE 13

Which algorithm to tune?

MOTA and tMOPSO are methods for tuning a selected algorithm. Question: which algorithm should be tuned?

SLIDE 14

Benchmarking via tunability

Motivation

Normal practice: benchmark algorithms using default CPVs on standard test suite(s).

Issue: the CPV choices available to the practitioner are not incorporated. CPVs are important: they allow application to a vast range of objective function characteristics, constraints, and termination criteria (OFE budgets!).

SLIDE 15

Benchmarking via tunability

Algorithms versus Parameters

Deterministic search process: the basic building block. Sensitive to:

  • objective function
  • constraints
  • termination criteria

Algorithm: a set of search processes unified by a central idea. CPVs determine which deterministic process is executed.

SLIDE 16

Benchmarking via tunability

Description

Premise: an algorithm is well suited to a problem if it is easy to find CPVs resulting in favourable performance. Tuning effort must therefore be incorporated!


SLIDE 18

Benchmarking via tunability

Demonstration

Numerical experiments showed that benchmarking via tunability is an effective method. DE [3] versus EGO [2]:

  • With default CPVs: EGO is better at low OFE budgets, DE is better at higher OFE budgets.
  • Benchmarking via tunability: the OFE budget is largely insignificant; EGO is better for problems compatible with its surrogate meta-model.

SLIDE 19

Conclusion

Select an algorithm and CPVs appropriate for the problem at hand, considering:

  • objective function characteristics
  • constraints imposed
  • termination criteria (OFE budget)

Tuning to testing problems suspected of being representative can assist in this regard.

Contributions:

  • tMOPSO, for tuning to a single criterion under multiple OFE budgets
  • MOTA, for tuning to multiple criteria under multiple OFE budgets
  • Benchmarking via tunability, to help select the algorithm to tune

SLIDE 20

Acknowledgements

  • Family
  • Prof. Heyns and Prof. Kok
  • GNU/Linux
  • Friends
  • NRF
  • CHPC
  • The giants upon whose shoulders we stand.

"The first principle is that you must not fool yourself, and you are the easiest person to fool."
Richard P. Feynman

SLIDE 21

Bibliography

[1] K. Deb, A. Pratap, S. Agarwal, and T. Meyarivan. A fast and elitist multiobjective genetic algorithm: NSGA-II. IEEE Transactions on Evolutionary Computation, 6(2):182–197, 2002.

[2] D. R. Jones, M. Schonlau, and W. J. Welch. Efficient global optimization of expensive black-box functions. Journal of Global Optimization, 13(4):455–492, 1998.

[3] R. Storn and K. Price. Differential evolution - a simple and efficient heuristic for global optimization over continuous spaces. Journal of Global Optimization, 11(4):341–359, 1997.

[4] D. H. Wolpert and W. G. Macready. No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation, 1(1):67–82, 1997.

[5] Q. Zhang and H. Li. MOEA/D: A multiobjective evolutionary algorithm based on decomposition. IEEE Transactions on Evolutionary Computation, 11(6):712–731, 2007.
