

SLIDE 1

Evolutionary Multiobjective Optimization: Current and Future Challenges

Carlos A. Coello Coello
CINVESTAV-IPN
Depto. de Ingeniería Eléctrica
Sección de Computación
Av. Instituto Politécnico Nacional No. 2508
Col. San Pedro Zacatenco
México, D.F. 07300, MEXICO

ccoello@cs.cinvestav.mx

SLIDE 2

Motivation

Most problems in nature have several (possibly conflicting) objectives to be satisfied. Many of these problems are frequently treated as single-objective optimization problems by transforming all but one objective into constraints.

SLIDE 3

What is a multiobjective optimization problem?

The Multiobjective Optimization Problem (MOP), also called the multicriteria, multiperformance, or vector optimization problem, can be defined in words as the problem of finding (Osyczka, 1985): a vector of decision variables which satisfies constraints and optimizes a vector function whose elements represent the objective functions. These functions form a mathematical description of performance criteria which are usually in conflict with each other. Hence, the term "optimize" means finding a solution that would give values of all the objective functions acceptable to the decision maker.

SLIDE 4

A Formal Definition

The general Multiobjective Optimization Problem (MOP) can be formally defined as: find the vector $\vec{x}^* = [x_1^*, x_2^*, \ldots, x_n^*]^T$ which will satisfy the $m$ inequality constraints:

$$g_i(\vec{x}) \geq 0, \qquad i = 1, 2, \ldots, m \quad (1)$$

the $p$ equality constraints:

$$h_i(\vec{x}) = 0, \qquad i = 1, 2, \ldots, p \quad (2)$$

and will optimize the vector function:

$$\vec{f}(\vec{x}) = [f_1(\vec{x}), f_2(\vec{x}), \ldots, f_k(\vec{x})]^T \quad (3)$$
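To make the definition concrete, here is a minimal sketch in Python of a hypothetical bi-objective instance; the objective functions and the box constraints are illustrative choices, not taken from the slides.

```python
import numpy as np

def g(x):
    """Inequality constraints g_i(x) >= 0; here they simply keep x in [0, 2]^n."""
    return np.concatenate([x, 2.0 - x])

def f(x):
    """Objective vector f(x) = [f1(x), f2(x)]^T with a built-in trade-off:
    f1 is minimized at x = 0, f2 at x = 2."""
    return np.array([np.sum(x ** 2), np.sum((x - 2.0) ** 2)])

def is_feasible(x):
    return np.all(g(x) >= 0.0)

x = np.array([0.5, 1.0])
print(is_feasible(x), f(x))  # True [1.25 3.25]
```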

SLIDE 5

What is the notion of optimum in multiobjective optimization?

Having several objective functions, the notion of “optimum” changes, because in MOPs, we are really trying to find good compromises (or “trade-offs”) rather than a single solution as in global optimization. The notion of “optimum” that is most commonly adopted is that originally proposed by Francis Ysidro Edgeworth in 1881.

SLIDE 6

What is the notion of optimum in multiobjective optimization?

This notion was later generalized by Vilfredo Pareto (in 1896). Although some authors call this notion the Edgeworth-Pareto optimum, we will use the most commonly accepted term: Pareto optimum.

SLIDE 7

Definition of Pareto Optimality

We say that a vector of decision variables $\vec{x}^* \in \mathcal{F}$ is Pareto optimal if there does not exist another $\vec{x} \in \mathcal{F}$ such that $f_i(\vec{x}) \leq f_i(\vec{x}^*)$ for all $i = 1, \ldots, k$ and $f_j(\vec{x}) < f_j(\vec{x}^*)$ for at least one $j$.

SLIDE 8

Definition of Pareto Optimality

In words, this definition says that $\vec{x}^*$ is Pareto optimal if there exists no feasible vector of decision variables $\vec{x} \in \mathcal{F}$ which would decrease some criterion without causing a simultaneous increase in at least one other criterion. Unfortunately, this concept almost always yields not a single solution, but rather a set of solutions called the Pareto optimal set. The vectors $\vec{x}^*$ corresponding to the solutions included in the Pareto optimal set are called nondominated. The plot of the objective functions whose nondominated vectors are in the Pareto optimal set is called the Pareto front.
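The following Python code is a minimal sketch of these definitions, assuming all objectives are to be minimized: a pairwise dominance test plus a brute-force filter that extracts the nondominated vectors from a set.

```python
def dominates(fa, fb):
    """True if fa Pareto-dominates fb: no worse in every objective
    and strictly better in at least one (minimization assumed)."""
    return (all(a <= b for a, b in zip(fa, fb))
            and any(a < b for a, b in zip(fa, fb)))

def nondominated(objs):
    """Indices of the nondominated vectors in objs (brute force)."""
    return [i for i, fi in enumerate(objs)
            if not any(dominates(fj, fi)
                       for j, fj in enumerate(objs) if j != i)]

objs = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]
print(nondominated(objs))  # [0, 1, 2] -- (3.0, 3.0) is dominated by (2.0, 2.0)
```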

SLIDE 9

Sample Pareto Front

[Figure: a sample Pareto front plotted in objective space (axes F1 and F2).]

SLIDE 10

Origins of EMOO

The first actual implementation of what is now called a multi-objective evolutionary algorithm (or MOEA, for short) was Schaffer's Vector Evaluated Genetic Algorithm (VEGA), which was introduced in the mid-1980s, mainly aimed at solving problems in machine learning (Schaffer, 1985).

SLIDE 11

VEGA

[Figure 1: Schematic of VEGA selection. STEP 1: from Generation(t), select n subgroups using each dimension of performance in turn; STEP 2: shuffle the selected parents; STEP 3: apply genetic operators to produce Generation(t+1).]
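The selection step of Figure 1 can be sketched in Python as follows; fitness-proportional selection inside each subgroup is assumed (Schaffer used proportional selection), and the conversion from minimized objectives to fitness values is an illustrative choice.

```python
import numpy as np

rng = np.random.default_rng(0)

def vega_select(pop, objs):
    """pop: (popsize, ...) array of individuals; objs: (popsize, k) array
    of objective values to be minimized. Returns shuffled parents."""
    popsize, k = objs.shape
    subgroups = []
    for i in range(k):  # STEP 1: one subgroup per dimension of performance
        fitness = objs[:, i].max() - objs[:, i] + 1e-12  # lower cost -> higher fitness
        idx = rng.choice(popsize, size=popsize // k, p=fitness / fitness.sum())
        subgroups.append(pop[idx])
    parents = np.concatenate(subgroups)
    rng.shuffle(parents)  # STEP 2: shuffle (STEP 3, the genetic operators, follows)
    return parents
```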

SLIDE 12

Early Algorithms

From the second half of the 1980s up to the first half of the 1990s, few other researchers developed MOEAs. Most of the work reported back then involved the following types of MOEAs:

  • Aggregating functions (mainly linear)
  • Lexicographic ordering
  • Target-vector approaches

Such approaches indicated strong roots in Operations Research. Note, however, that these roots disappeared over time. Algorithms remained relatively simple to implement during these early days of MOEAs.

SLIDE 13

MOEAs: First Generation

The major step towards the first generation of MOEAs was taken by David Goldberg (in 1989) when he proposed a selection scheme based on the concept of Pareto optimality (now called Pareto ranking). He also proposed the use of fitness sharing and niching to maintain diversity (something necessary to avoid converging to a single solution due to stochastic noise).
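A minimal sketch of this Pareto-ranking idea, peeling off successive nondominated layers (the fitness sharing and niching parts are omitted); the dominates() helper is repeated from the earlier sketch so the code is self-contained.

```python
def dominates(fa, fb):
    """fa Pareto-dominates fb (minimization assumed)."""
    return (all(a <= b for a, b in zip(fa, fb))
            and any(a < b for a, b in zip(fa, fb)))

def pareto_rank(objs):
    """Rank 1 = nondominated front; remove it and repeat on the remainder."""
    remaining = set(range(len(objs)))
    ranks = [0] * len(objs)
    rank = 1
    while remaining:
        front = {i for i in remaining
                 if not any(dominates(objs[j], objs[i])
                            for j in remaining if j != i)}
        for i in front:
            ranks[i] = rank
        remaining -= front
        rank += 1
    return ranks

print(pareto_rank([(1, 4), (2, 2), (4, 1), (3, 3), (5, 5)]))  # [1, 1, 1, 2, 3]
```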

SLIDE 14

MOEAs: First Generation

The most representative MOEAs of the first generation are the following:

  • Nondominated Sorting Genetic Algorithm (NSGA) (Srinivas & Deb, 1994)
  • Niched-Pareto Genetic Algorithm (NPGA) (Horn et al., 1994)
  • Multi-Objective Genetic Algorithm (MOGA) (Fonseca & Fleming, 1993)

SLIDE 15

MOEAs: First Generation

All the previously indicated algorithms use Pareto ranking, but in subtly different ways. For example, the NSGA uses layers to classify individuals, whereas MOGA ranks the whole population at the same time. The NPGA, for its part, uses tournaments based on nondominance. Although there is no theoretical study that indicates advantages or disadvantages of any of these ranking schemes, several practitioners reported that MOGA seemed to be the most efficient MOEA of the first generation.

SLIDE 16

The Main Questions of the First Generation

The main questions raised during the first generation were:

  • Are aggregating functions (so common before and even during the golden years of Pareto ranking) really doomed to fail when the Pareto front is non-convex?
  • Can we find ways to maintain diversity in the population without using niches (or fitness sharing), which requires an $O(M^2)$ process, where $M$ refers to the population size?

SLIDE 17

The Main Questions of the First Generation

  • If we assume that there is no way of reducing the $O(kM^2)$ process required to perform Pareto ranking ($k$ is the number of objectives and $M$ is the population size), how can we design a more efficient MOEA?
  • Do we have appropriate test functions and metrics to evaluate an MOEA quantitatively?

  • When will somebody develop theoretical foundations for MOEAs?

SLIDE 18

MOEAs: Second Generation

The second generation of MOEAs was born with the introduction of the notion of elitism. In the context of multiobjective optimization, elitism usually (although not necessarily) refers to the use of an external population (also called secondary population) to retain the nondominated individuals found during the evolutionary process.
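A minimal sketch of one way such an external population can be updated; the acceptance rule (reject a candidate dominated by any member, evict members the candidate dominates) is the common baseline, while the size bound and the oldest-first eviction are illustrative placeholders, not any specific published scheme. It reuses dominates() from the earlier sketches.

```python
def update_archive(archive, cand, max_size=100):
    """archive: list of objective vectors; cand: candidate objective vector."""
    if any(dominates(a, cand) for a in archive):
        return archive  # cand is dominated by the archive: reject it
    archive = [a for a in archive if not dominates(cand, a)]  # evict dominated members
    archive.append(cand)
    if len(archive) > max_size:
        archive.pop(0)  # placeholder policy; real MOEAs prune by density instead
    return archive
```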

SLIDE 19

MOEAs: Second Generation

The use of this external population (or file) raises several questions:

  • How does the external file interact with the main population?
  • What do we do when the external file is full?
  • Do we impose additional criteria to enter the file instead of just using Pareto dominance?

SLIDE 20

MOEAs: Second Generation

Elitism can also be introduced through the use of a (µ + λ)-selection in which parents compete with their children and those which are nondominated (and possibly comply with some additional criterion such as providing a better distribution of solutions) are selected for the following generation.
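A minimal sketch of this form of elitism, reusing pareto_rank() from the earlier sketch; tie-breaking inside the last admitted front (e.g. by a diversity measure such as providing a better distribution of solutions) is omitted, so this is only the skeleton of the idea.

```python
def mu_plus_lambda_select(parent_objs, child_objs, mu):
    """Merge parents and children, then keep the mu best by Pareto rank."""
    pool = list(parent_objs) + list(child_objs)
    ranks = pareto_rank(pool)  # lower rank = better front
    order = sorted(range(len(pool)), key=lambda i: ranks[i])
    return [pool[i] for i in order[:mu]]
```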

SLIDE 21

MOEAs: Second Generation

Some of the most representative MOEAs of the second generation are the following:

  • Strength Pareto Evolutionary Algorithm (SPEA) (Zitzler & Thiele, 1999)
  • Strength Pareto Evolutionary Algorithm 2 (SPEA2) (Zitzler et al., 2001)
  • Pareto Archived Evolution Strategy (PAES) (Knowles & Corne, 2000)

SLIDE 22

MOEAs: Second Generation

  • Nondominated Sorting Genetic Algorithm II (NSGA-II) (Deb et al., 2000)
  • Niched Pareto Genetic Algorithm 2 (NPGA 2) (Erickson et al., 2001)
  • Pareto Envelope-based Selection Algorithm (PESA) (Corne et al., 2000)
  • Micro Genetic Algorithm for Multiobjective Optimization (microGA) (Toscano & Coello, 2001)

SLIDE 23

MOEAs: Second Generation

Second generation MOEAs can be characterized by an emphasis on efficiency and by the use of elitism (in the two main forms previously described). During the second generation, some important theoretical work also took place, mainly related to convergence. Also, metrics and standard test functions were developed to validate new MOEAs.

SLIDE 24

The Main Questions of the Second Generation

  • Are our metrics reliable? What about our test functions?
  • Are we ready to tackle problems with more than two objective functions efficiently? Is Pareto ranking doomed to fail when dealing with too many objectives? If so, then what is the limit up to which Pareto ranking can be used to select individuals reliably?
  • What are the most relevant theoretical aspects of evolutionary multiobjective optimization that are worth exploring in the short term?

SLIDE 25

Analysis of the Literature

[Figure 2: MOEA citations by year (1983 to mid-2002); y-axis: number of publications.]

SLIDE 26

Types of Applications

[Figure 3: Number of applications reviewed per application area. Eng = Engineering, Ind = Industrial, Sci = Scientific, Misc = Miscellaneous.]

SLIDE 27

Engineering Applications

[Figure 4: Number of applications reviewed per engineering field. ENH = Environmental, Naval, and Hydraulic; EE = Electrical and Electronics; Tel = Telecommunications and Network Optimization; RC = Robotics and Control; SM = Structural & Mechanical; CC = Civil and Construction; Tra = Transport; A = Aeronautical.]

SLIDE 28

Scientific Applications

[Figure 5: Number of applications reviewed per scientific field. GE = Geography, CH = Chemistry, PH = Physics, MD = Medicine, EC = Ecology, CSE = Computer Science and Computer Engineering.]

SLIDE 29

Industrial Applications

[Figure 6: Number of applications reviewed per type of industrial application. DM = Design and Manufacture, SC = Scheduling, GR = Grouping and Packing, MA = Management.]

SLIDE 30

Miscellaneous Applications

[Figure 7: Number of applications reviewed per type of miscellaneous application (bars: FI, EC, MI, CP). FI = Finance, CP = Classification and Prediction.]

SLIDE 31

Future Challenges

  • Incorporation of preferences in MOEAs
  • Dynamic Test Functions
  • Highly-Constrained Search Spaces
  • Parallelism
  • Theoretical Foundations
  • Use of More Efficient Data Structures

SLIDE 32

To know more about evolutionary multiobjective optimization

Please visit our EMOO repository located at:

http://delta.cs.cinvestav.mx/~ccoello/EMOO

with mirrors at:

http://www.jeo.org/emo

and:

http://www.lania.mx/~ccoello/EMOO

SLIDE 33

To know more about evolutionary multiobjective optimization

SLIDE 34

To know more about evolutionary multiobjective optimization

The EMOO repository currently contains:

  • Over 1000 bibliographic references, including 38 PhD theses
  • Contact info of about 50 EMOO researchers
  • Public domain implementations of MOGA (with elitism), SPEA, NSGA, NSGA-II, the microGA, and PAES, among others

SLIDE 35

To know more about evolutionary multiobjective optimization

You can consult the following recently published book: Carlos A. Coello Coello, David A. Van Veldhuizen and Gary B. Lamont, Evolutionary Algorithms for Solving Multi-Objective Problems, Kluwer Academic Publishers, New York, May 2002, ISBN 0-3064-6762-3.
