
Evolutionary Multiobjective Optimization: Current and Future Challenges



  1. Evolutionary Multiobjective Optimization: Current and Future Challenges. Carlos A. Coello Coello, CINVESTAV-IPN, Depto. de Ingeniería Eléctrica, Sección de Computación, Av. Instituto Politécnico Nacional No. 2508, Col. San Pedro Zacatenco, México, D.F. 07300, MEXICO. ccoello@cs.cinvestav.mx

  2. Motivation. Most problems in nature have several (possibly conflicting) objectives to be satisfied. These problems are frequently treated as single-objective optimization problems by transforming all but one objective into constraints.

  3. What is a multiobjective optimization problem? The Multiobjective Optimization Problem (MOP) (also called the multicriteria optimization, multiperformance, or vector optimization problem) can be defined (in words) as the problem of finding (Osyczka, 1985): a vector of decision variables which satisfies constraints and optimizes a vector function whose elements represent the objective functions. These functions form a mathematical description of performance criteria which are usually in conflict with each other. Hence, the term “optimize” means finding a solution which would give values of all the objective functions acceptable to the decision maker.

  4. A Formal Definition. The general Multiobjective Optimization Problem (MOP) can be formally defined as: find the vector $\vec{x}^{\,*} = [x_1^*, x_2^*, \ldots, x_n^*]^T$ which will satisfy the $m$ inequality constraints:

  $$g_i(\vec{x}) \geq 0 \qquad i = 1, 2, \ldots, m \tag{1}$$

  the $p$ equality constraints:

  $$h_i(\vec{x}) = 0 \qquad i = 1, 2, \ldots, p \tag{2}$$

  and will optimize the vector function:

  $$\vec{f}(\vec{x}) = [f_1(\vec{x}), f_2(\vec{x}), \ldots, f_k(\vec{x})]^T \tag{3}$$
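  As a concrete illustration (not part of the original slides), the sketch below encodes a tiny MOP of this form in Python. The specific objectives are an assumed example, the classic one-variable test problem with $f_1(x) = x^2$ and $f_2(x) = (x-2)^2$, and a single box constraint stands in for the $g_i$.

```python
# A minimal sketch (assumed example, not from the slides) of an MOP in the
# form above: two objectives to minimize and one inequality constraint.

def f(x):
    """Vector objective function f(x) = [f1(x), f2(x)] (to be minimized)."""
    return [x ** 2, (x - 2) ** 2]

def g(x):
    """A single inequality constraint g1(x) >= 0; here a box: |x| <= 10."""
    return 10.0 - abs(x)

def feasible(x):
    """x is feasible when every inequality constraint is nonnegative."""
    return g(x) >= 0.0
```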

  5. What is the notion of optimum in multiobjective optimization? Having several objective functions, the notion of “optimum” changes, because in MOPs we are really trying to find good compromises (or “trade-offs”) rather than a single solution as in global optimization. The notion of “optimum” most commonly adopted is the one originally proposed by Francis Ysidro Edgeworth in 1881.

  6. What is the notion of optimum in multiobjective optimization? This notion was later generalized by Vilfredo Pareto (in 1896). Although some authors refer to this notion as the Edgeworth-Pareto optimum, we will use the more commonly accepted term: Pareto optimum.

  7. Definition of Pareto Optimality. We say that a vector of decision variables $\vec{x}^* \in \mathcal{F}$ is Pareto optimal if there does not exist another $\vec{x} \in \mathcal{F}$ such that $f_i(\vec{x}) \leq f_i(\vec{x}^*)$ for all $i = 1, \ldots, k$ and $f_j(\vec{x}) < f_j(\vec{x}^*)$ for at least one $j$.

  8. Definition of Pareto Optimality. In words, this definition says that $\vec{x}^*$ is Pareto optimal if there exists no feasible vector of decision variables $\vec{x} \in \mathcal{F}$ which would decrease some criterion without causing a simultaneous increase in at least one other criterion. Unfortunately, this concept almost always gives not a single solution, but rather a set of solutions called the Pareto optimal set. The vectors $\vec{x}^*$ corresponding to the solutions included in the Pareto optimal set are called nondominated. The plot of the objective functions whose nondominated vectors are in the Pareto optimal set is called the Pareto front.
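  The definition translates directly into code. The sketch below (helper names are my own, not from the slides) checks Pareto dominance for minimization and extracts the nondominated subset of a set of objective vectors.

```python
# Hedged sketch: Pareto dominance for minimization, and extraction of the
# nondominated subset of a list of objective vectors.

def dominates(fa, fb):
    """True if objective vector fa Pareto-dominates fb: fa is no worse in
    every objective and strictly better in at least one (minimization)."""
    return (all(a <= b for a, b in zip(fa, fb))
            and any(a < b for a, b in zip(fa, fb)))

def nondominated(objs):
    """Indices of the nondominated vectors among the objective vectors."""
    return [i for i, fi in enumerate(objs)
            if not any(dominates(fj, fi)
                       for j, fj in enumerate(objs) if j != i)]
```

  For example, `nondominated([[1, 5], [2, 2], [3, 3]])` yields `[0, 1]`: the third vector is dominated by `[2, 2]`, while the first two are mutual trade-offs.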

  9. Sample Pareto Front. [Figure: a sample Pareto front plotted in objective space, with objectives F1 and F2 on axes ranging from 0 to 1.]

  10. Origins of EMOO. The first actual implementation of what is now called a multi-objective evolutionary algorithm (or MOEA, for short) was Schaffer's Vector Evaluated Genetic Algorithm (VEGA), introduced in the mid-1980s and aimed mainly at solving problems in machine learning (Schaffer, 1985).

  11. VEGA. [Figure 1: Schematic of VEGA selection. STEP 1: from Generation(t), select n subgroups, using each dimension of performance (objective) in turn; STEP 2: shuffle the subgroups together; STEP 3: apply the genetic operators to produce Generation(t+1).]
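  The sketch below is my reconstruction of the selection scheme in Figure 1, not Schaffer's original code; in particular, the proportional selection VEGA uses within each subgroup is simplified here to truncation on each objective in turn.

```python
import random

def vega_select(pop, objs, k):
    """VEGA-style selection sketch. pop: list of individuals; objs: their
    k-objective vectors (minimization); k: number of objectives. Fills a
    mating pool from k subgroups, each chosen using one objective
    ('dimension of performance') in turn, then shuffles the pool (STEP 2)
    before the genetic operators of STEP 3 are applied elsewhere."""
    sub = len(pop) // k              # size of each subgroup
    pool = []
    for m in range(k):               # STEP 1: one subgroup per objective
        ranked = sorted(range(len(pop)), key=lambda i: objs[i][m])
        pool.extend(pop[i] for i in ranked[:sub])
    random.shuffle(pool)
    return pool
```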

  12. Early Algorithms. From the second half of the 1980s up to the first half of the 1990s, few other researchers developed MOEAs. Most of the work reported back then involved the following types of MOEAs (a minimal sketch of the first follows below):
  • Aggregating functions (mainly linear)
  • Lexicographic ordering
  • Target-vector approaches
  Such approaches indicated strong roots in Operations Research. Note, however, that these roots disappeared over time. Algorithms remained relatively simple to implement during these early days of MOEAs.
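  As an assumed illustration (not from the slides), the simplest aggregating function is a linear weighted sum, which turns the MOP back into a single-objective problem:

```python
def weighted_sum(fx, w):
    """Scalarize an objective vector fx with nonnegative weights w
    (conventionally summing to 1), yielding a single value to minimize."""
    return sum(wi * fi for wi, fi in zip(w, fx))
```

  Minimizing such a linear aggregation for varying weights can only reach points on the convex portions of the Pareto front, which is precisely the concern raised later, on slide 16.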

  13. MOEAs: First Generation. The major step towards the first generation of MOEAs was taken by David Goldberg (in 1989) when he proposed a selection scheme based on the concept of Pareto optimality (now called Pareto ranking). He also proposed the use of fitness sharing and niching to maintain diversity (something necessary to avoid converging to a single solution by effect of stochastic noise).
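  A sketch of standard fitness sharing follows (the form is assumed, not taken from Goldberg's text): each raw fitness is divided by a niche count that grows with the number of neighbors within a sharing radius. Note the nested loop over the population, which is exactly the $O(M^2)$ cost questioned on slide 16.

```python
def shared_fitness(fitness, dist, sigma_share, alpha=1.0):
    """fitness: raw fitness values (to be maximized); dist(i, j): distance
    between individuals i and j; sigma_share: niche radius."""
    M = len(fitness)
    shared = []
    for i in range(M):
        niche = sum(max(0.0, 1.0 - (dist(i, j) / sigma_share) ** alpha)
                    for j in range(M))
        shared.append(fitness[i] / niche)  # niche >= 1 since dist(i, i) = 0
    return shared
```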

  14. MOEAs: First Generation. The most representative MOEAs of the first generation are the following:
  • Nondominated Sorting Genetic Algorithm (NSGA) (Srinivas & Deb, 1994)
  • Niched-Pareto Genetic Algorithm (NPGA) (Horn et al., 1994)
  • Multi-Objective Genetic Algorithm (MOGA) (Fonseca & Fleming, 1993)

  15. MOEAs: First Generation. All the previously indicated algorithms use Pareto ranking, but in subtly different ways. For example, the NSGA uses layers to classify individuals, whereas MOGA ranks the whole population at the same time. The NPGA, for its part, uses tournaments based on nondominance. Although there is no theoretical study that indicates advantages or disadvantages of any of these ranking schemes, several practitioners reported that MOGA seemed to be the most efficient MOEA of the first generation.
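  The two ranking schemes just contrasted can be sketched as follows (assumed implementations, reusing the `dominates()` helper defined earlier):

```python
def moga_rank(objs):
    """MOGA: the whole population is ranked at once;
    rank(i) = 1 + (number of individuals dominating i)."""
    n = len(objs)
    return [1 + sum(dominates(objs[j], objs[i]) for j in range(n) if j != i)
            for i in range(n)]

def nsga_layers(objs):
    """NSGA: classify individuals into successive nondominated layers
    (fronts), peeling off one nondominated front at a time."""
    remaining = list(range(len(objs)))
    layers = []
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i])
                            for j in remaining if j != i)]
        layers.append(front)
        remaining = [i for i in remaining if i not in front]
    return layers
```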

  16. The Main Questions of the First Generation. The main questions raised during the first generation were:
  • Are aggregating functions (so common before and even during the golden years of Pareto ranking) really doomed to fail when the Pareto front is non-convex?
  • Can we find ways to maintain diversity in the population without using niches (or fitness sharing), which requires an $O(M^2)$ process, where $M$ refers to the population size?

  17. The Main Questions of the First Generation.
  • If we assume that there is no way of reducing the $O(kM^2)$ process required to perform Pareto ranking ($k$ is the number of objectives and $M$ is the population size), how can we design a more efficient MOEA?
  • Do we have appropriate test functions and metrics to quantitatively evaluate an MOEA?
  • When will somebody develop theoretical foundations for MOEAs?

  18. MOEAs: Second Generation. The second generation of MOEAs was born with the introduction of the notion of elitism. In the context of multiobjective optimization, elitism usually (although not necessarily) refers to the use of an external population (also called a secondary population) to retain the nondominated individuals found along the evolutionary process.

  19. MOEAs: Second Generation. The use of this external population (or file) raises several questions (one plausible archive-update rule is sketched below):
  • How does the external file interact with the main population?
  • What do we do when the external file is full?
  • Do we impose additional criteria to enter the file instead of just using Pareto dominance?
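  The sketch below gives one plausible set of answers to these questions (my own, not the rule of any specific MOEA): candidates enter the external file only if nondominated with respect to it, file members they dominate are evicted, and when the file overflows, a density criterion picks the entry to drop, in the spirit of SPEA- or PAES-style archives. `most_crowded` is a hypothetical helper, and `dominates()` is the one sketched earlier.

```python
def update_archive(archive, fx, max_size, most_crowded):
    """archive: list of objective vectors; fx: candidate objective vector;
    most_crowded(archive): returns the member lying in the densest region."""
    if any(dominates(a, fx) for a in archive):
        return archive                                      # rejected: dominated
    archive = [a for a in archive if not dominates(fx, a)]  # evict dominated
    archive.append(fx)
    if len(archive) > max_size:
        archive.remove(most_crowded(archive))               # prune for diversity
    return archive
```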

  20. MOEAs: Second Generation. Elitism can also be introduced through the use of ($\mu$ + $\lambda$)-selection, in which parents compete with their children, and those which are nondominated (and possibly comply with some additional criterion, such as providing a better distribution of solutions) are selected for the following generation.
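  A minimal sketch of this second form of elitism, assuming Pareto rank as the survival criterion (a density tiebreak, for the better distribution the slide mentions, could be added to the sort key); it reuses `moga_rank()` from the earlier sketch:

```python
def plus_selection(parents, offspring, objs_of, mu):
    """(mu + lambda)-selection sketch: parents and offspring compete in one
    pool, and the mu best by Pareto rank survive. objs_of(ind): the
    objective vector of an individual."""
    pool = parents + offspring
    ranks = moga_rank([objs_of(ind) for ind in pool])   # rank 1 = nondominated
    order = sorted(range(len(pool)), key=lambda i: ranks[i])
    return [pool[i] for i in order[:mu]]
```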

  21. MOEAs: Second Generation. Some of the most representative MOEAs of the second generation are the following:
  • Strength Pareto Evolutionary Algorithm (SPEA) (Zitzler & Thiele, 1999)
  • Strength Pareto Evolutionary Algorithm 2 (SPEA2) (Zitzler et al., 2001)
  • Pareto Archived Evolution Strategy (PAES) (Knowles & Corne, 2000)

  22. MOEAs: Second Generation.
  • Nondominated Sorting Genetic Algorithm II (NSGA-II) (Deb et al., 2000)
  • Niched Pareto Genetic Algorithm 2 (NPGA 2) (Erickson et al., 2001)
  • Pareto Envelope-based Selection Algorithm (PESA) (Corne et al., 2000)
  • Micro Genetic Algorithm for Multiobjective Optimization (microGA) (Toscano & Coello, 2001)

  23. MOEAs: Second Generation. Second-generation MOEAs can be characterized by an emphasis on efficiency and by the use of elitism (in the two main forms previously described). During the second generation, some important theoretical work also took place, mainly related to convergence. Also, metrics and standard test functions were developed to validate new MOEAs.

  24. The Main Questions of the Second Generation.
  • Are our metrics reliable? What about our test functions?
  • Are we ready to tackle problems with more than two objective functions efficiently? Is Pareto ranking doomed to fail when dealing with too many objectives? If so, what is the limit up to which Pareto ranking can be used to select individuals reliably?
  • What are the most relevant theoretical aspects of evolutionary multiobjective optimization that are worth exploring in the short term?

  25. Analysis of the Literature. [Figure 2: MOEA citations by year, up to mid-2002. Bar chart of the number of publications per year, 1983 to 2002.]
