
Recent Results and Open Problems in Evolutionary Multiobjective Optimization - PowerPoint PPT Presentation



  1. Recent Results and Open Problems in Evolutionary Multiobjective Optimization. Carlos A. Coello Coello, CINVESTAV-IPN, Evolutionary Computation Group (EVOCINV), Departamento de Computación, Av. IPN No. 2508, Col. San Pedro Zacatenco, México, D.F. 07360, MEXICO. ccoello@cs.cinvestav.mx. Wellington, New Zealand.

  2. Outline: 1. Introduction; 2. Recent Results and Open Problems (Algorithms; MOEAs for Expensive Objective Functions; Self-Adaptation and Online Adaptation; Scalability); 3. The Challenges of this Century; 4. Conclusions.

  3. Multi-Objective Evolutionary Algorithms. Evolutionary algorithms seem particularly suitable to solve multiobjective optimization problems because they deal simultaneously with a set of possible solutions (the so-called population). This allows us to find several members of the Pareto optimal set in a single run of the algorithm, instead of having to perform a series of separate runs, as in the case of traditional mathematical programming techniques. Additionally, evolutionary algorithms are less susceptible to the shape or continuity of the Pareto front (e.g., they can easily deal with discontinuous or concave Pareto fronts), whereas these two issues are normally a real concern for mathematical programming techniques.
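To make the Pareto-optimality terminology used throughout the slides concrete, here is a minimal sketch (not part of the original presentation) of a dominance test and a non-dominated filter for minimization problems; the function names are illustrative only.

```python
# Minimal sketch of Pareto dominance for minimization problems.
# Names (dominates, non_dominated) are illustrative, not from the slides.

def dominates(a, b):
    """True if objective vector a Pareto-dominates b (all <=, at least one <)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated(points):
    """Return the subset of objective vectors not dominated by any other."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]

# Example: a small population evaluated on two objectives.
population = [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0), (3.0, 3.0)]
print(non_dominated(population))   # [(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)]
```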

  4. Historical Highlights. The potential of evolutionary algorithms in multiobjective optimization was first hinted at by Richard S. Rosenberg in his 1967 PhD thesis, entitled “Simulation of genetic populations with biochemical properties”.

  5. Historical Highlights. However, the first actual implementation of a multi-objective evolutionary algorithm was John David Schaffer’s Vector Evaluated Genetic Algorithm (VEGA), which dates back to 1984. This was a naive multi-objective evolutionary algorithm, whose selection mechanism did not rely on Pareto optimality.
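For illustration only, the following sketch reconstructs the flavor of VEGA-style selection: the mating pool is filled by selecting on one objective at a time, which is why no notion of Pareto dominance is involved. This is a simplified, assumption-laden reconstruction (it uses truncation selection per objective rather than Schaffer's original proportional selection), not the original algorithm.

```python
import random

def vega_style_selection(population, objectives, pool_size):
    """Rough sketch of VEGA-like selection (minimization assumed):
    each objective fills an equal share of the mating pool on its own,
    so no Pareto-dominance information is ever used."""
    share = pool_size // len(objectives)
    pool = []
    for f in objectives:
        pool.extend(sorted(population, key=f)[:share])  # best on this objective only
    random.shuffle(pool)  # mix subpopulations before crossover/mutation
    return pool
```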

  6. Historical Highlights. First Generation: From 1985 to 1998, we can identify a first generation of multi-objective evolutionary algorithms (MOEAs), which were non-elitist and relatively naive: the Vector Evaluated Genetic Algorithm (VEGA, 1985), the Nondominated Sorting Genetic Algorithm (NSGA, 1994), the Niched-Pareto Genetic Algorithm (NPGA, 1994), and the Multi-Objective Genetic Algorithm (MOGA, 1993).

  7. Historical Highlights. Second Generation: The second generation brought us a set of elitist MOEAs which were more efficient and effective and had a more elegant design: SPEA and SPEA2 (1999, 2001), NSGA-II (2000, 2002), the micro-GA for multiobjective optimization (2001), PAES (2000), PESA and PESA-II (2000, 2001), and many others.

  8. Historical Highlights. Since Goldberg’s early proposal (1989), MOEAs have consisted of two basic components: a selection mechanism that normally (but not necessarily) incorporates Pareto optimality, and a density estimator, which is responsible for maintaining diversity and therefore keeps the MOEA from converging to a single solution.
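A generic sketch of how these two components interact in survival selection, reusing the `dominates` helper from the earlier sketch and taking the density estimator as a parameter. This illustrates the general scheme only, not Goldberg's specific proposal.

```python
def pareto_ranks(points):
    """Non-dominated sorting: rank 0 = non-dominated front, rank 1 = next front, ..."""
    ranks = [None] * len(points)
    alive = list(range(len(points)))
    rank = 0
    while alive:
        front = [i for i in alive
                 if not any(dominates(points[j], points[i]) for j in alive if j != i)]
        for i in front:
            ranks[i] = rank
        alive = [i for i in alive if ranks[i] is None]
        rank += 1
    return ranks

def survive(points, density, k):
    """Keep k points: primary key is Pareto rank, ties broken by lower density
    (points in less crowded regions are preferred)."""
    ranks = pareto_ranks(points)
    order = sorted(range(len(points)),
                   key=lambda i: (ranks[i], density(points[i], points)))
    return [points[i] for i in order[:k]]
```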

  9. Historical Highlights. The main density estimators that have been used with MOEAs are: fitness sharing, clustering, entropy, adaptive grids, and crowding.
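As one concrete example from this list, here is a minimal crowding-distance estimator in the spirit of NSGA-II. This is a sketch under the assumption of a minimization problem; boundary points receive an infinite distance so they are always preferred.

```python
def crowding_distance(front):
    """Crowding distance (NSGA-II style) for a list of objective vectors.
    Larger distance = less crowded region = better for diversity."""
    n = len(front)
    if n == 0:
        return []
    m = len(front[0])
    dist = [0.0] * n
    for obj in range(m):
        order = sorted(range(n), key=lambda i: front[i][obj])
        lo, hi = front[order[0]][obj], front[order[-1]][obj]
        dist[order[0]] = dist[order[-1]] = float("inf")  # keep boundary points
        if hi == lo:
            continue
        for k in range(1, n - 1):
            gap = front[order[k + 1]][obj] - front[order[k - 1]][obj]
            dist[order[k]] += gap / (hi - lo)  # normalized neighbor gap
    return dist
```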

  10. [Figure: Number of papers published per year (up to mid-2015).]

  11. Recent Results and Open Problems: Introduction. After 30 years of existence, and with so much work done, EMO may seem intimidating to some people. If so many people have worked in this area for the last 15 years, what remains to be done?

  12. Luckily, there are still many opportunities to do research in this area, even within topics that seem to have been visited a lot in the past. “Imagination is more important than knowledge.” (Albert Einstein)

  13. Recent Results and Open Problems: Algorithms. The main current research trend regarding algorithmic development is to adopt a performance measure in the selection scheme of a MOEA. See, for example: ESP: The Evolution Strategy with Probability Mutation uses a hypervolume-based, scaling-independent, parameterless measure to truncate overpopulated external archives (Huband et al., 2003). IBEA: This is a framework that allows any performance indicator to be incorporated into the selection mechanism of a MOEA (Zitzler et al., 2004). Its authors tested it with the hypervolume and with the binary ε indicator.
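A compact sketch of the indicator-based fitness assignment idea behind IBEA, using the additive binary ε-indicator. The scaling constant `kappa` and the omission of objective normalization are simplifying assumptions here, not the authors' exact formulation.

```python
import math

def eps_indicator(a, b):
    """Additive binary epsilon-indicator (minimization): the smallest shift eps
    such that a, moved by eps in every objective, weakly dominates b."""
    return max(ai - bi for ai, bi in zip(a, b))

def ibea_fitness(population, kappa=0.05):
    """Indicator-based fitness in the spirit of IBEA (simplified sketch):
    each individual is penalized by the indicator values of all others against it."""
    return [sum(-math.exp(-eps_indicator(other, x) / kappa)
                for other in population if other is not x)
            for x in population]
```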

  14. SMS-EMOA: The S Metric Selection Evolutionary Multiobjective Algorithm is based on the hypervolume performance measure (Emmerich et al., 2005; Beume et al., 2007). SPAM: The Set Preference Algorithm for Multiobjective optimization is meant to generalize IBEA by allowing any sort of set preference relation (Zitzler et al., 2008).

  15. Why use the hypervolume? The hypervolume (also known as the S metric or the Lebesgue measure) of a set of solutions measures the size of the portion of objective space that is dominated by those solutions collectively. Its good side. Advantage: It has been proved that maximizing this performance measure is equivalent to finding the Pareto optimal set (Fleischer, 2003).
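For intuition, in the two-objective minimization case the hypervolume reduces to a sum of rectangle areas between consecutive non-dominated points and a user-chosen reference point. A minimal sketch, reusing the `non_dominated` helper from the earlier sketch; the reference point below is an arbitrary assumption.

```python
def hypervolume_2d(points, ref):
    """Hypervolume of a 2-objective minimization front w.r.t. reference point ref.
    Sums the rectangle areas spanned by consecutive non-dominated points."""
    front = sorted(non_dominated(points))  # ascending f1, hence descending f2
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in front:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv

# Example with the front from the earlier sketch and reference point (4, 5):
print(hypervolume_2d([(1.0, 4.0), (2.0, 2.0), (3.0, 1.0)], ref=(4.0, 5.0)))  # 8.0
```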

  16. Its good side. Advantage: Empirical studies have shown that, for a previously determined number of points, maximizing the hypervolume does indeed produce well-distributed subsets of the Pareto front (Knowles, 2003; Emmerich, 2005). Advantage: It measures convergence and, to a certain extent, also the spread of solutions along the Pareto front.

  17. Its bad side. Disadvantage: The computation of this performance measure depends on a reference point, which can influence the results in a significant manner. Some people have proposed using the worst objective function values in the current population, but this requires scaling the objectives. Disadvantage: The best algorithms known to compute the hypervolume have polynomial complexity in the number of points used, but this complexity grows exponentially with the number of objectives.
