  1. Automated generation of high-performance heuristics from flexible algorithm frameworks: Challenges and Perspectives
  Thomas Stützle, IRIDIA, CoDE, Université Libre de Bruxelles (ULB), Brussels, Belgium
  stuetzle@ulb.ac.be, iridia.ulb.ac.be/~stuetzle
  IRIDIA: Institut de Recherches Interdisciplinaires et de Développements en Intelligence Artificielle

  2. Outline
  1. What we have done / are doing
  2. Next steps? Challenges and perspectives

  3. Automatic offline configuration / algorithm design
  Typical performance measures:
  ◮ maximize solution quality (within a given computation time)
  ◮ minimize computation time (to reach an optimal solution)
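  These measures define a cost c(θ, π) for running configuration θ on instance π; in the standard offline formulation (the usual framing in this literature, not spelled out on the slide), the configurator then searches for

      \theta^{*} \in \operatorname*{arg\,min}_{\theta \in \Theta} \; \mathbb{E}_{\pi \sim \mathcal{P}} \big[ c(\theta, \pi) \big]

  where Θ is the space of parameter settings and P the distribution of training instances (the expectation also covers the algorithm's random seed).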

  4. Configurators

  5. The irace package
  Manuel López-Ibáñez, Jérémie Dubois-Lacoste, Thomas Stützle, and Mauro Birattari. The irace package, Iterated Race for Automatic Algorithm Configuration. Technical Report TR/IRIDIA/2011-004, IRIDIA, Université Libre de Bruxelles, Belgium, 2011.
  The irace Package: User Guide, 2016, Technical Report TR/IRIDIA/2016-004
  http://iridia.ulb.ac.be/irace
  ◮ implementation of Iterated Racing in R, but no knowledge of R necessary
  ◮ Goal 1: flexible
  ◮ Goal 2: easy to use
  ◮ parallel evaluation (MPI, multi-core, grid engines, ...)
  ◮ various extensions currently in progress (soon publicly available)
  irace has been shown to be effective for configuration tasks with several hundred variables.
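  A rough Python sketch of the iterated-racing idea behind irace follows; this is not the package's API, and the helper names, the elimination rule, and the sampling model are simplified assumptions. The key loop is: sample candidate configurations, race them instance by instance while discarding clearly worse ones, and let the survivors ("elites") bias the next round of sampling.

  # Rough sketch of iterated racing (NOT the irace package's API; names and the
  # elimination rule are simplified assumptions).
  import random

  def iterated_race(run, sample_near, instances, budget, n_candidates=10, n_elites=3):
      """run(config, instance) -> cost (lower is better); sample_near(None) samples uniformly."""
      elites = [None]
      used = 0
      while used < budget:
          # sample new candidate configurations, biased toward the surviving elites
          candidates = [sample_near(random.choice(elites)) for _ in range(n_candidates)]
          scores = {id(c): [] for c in candidates}
          alive = list(candidates)
          for instance in instances:
              for c in alive:                    # race: evaluate survivors on one more instance
                  scores[id(c)].append(run(c, instance))
                  used += 1
              # crude elimination step: keep the better half (irace uses statistical tests instead)
              ranked = sorted(alive, key=lambda c: sum(scores[id(c)]) / len(scores[id(c)]))
              alive = ranked[: max(n_elites, len(ranked) // 2)]
              if used >= budget or len(alive) <= n_elites:
                  break
          elites = alive[:n_elites]              # survivors seed the next iteration
      return elites[0]

  if __name__ == "__main__":
      # toy demo: tune one numeric parameter x to minimize (x - 3)^2 plus instance noise
      run = lambda x, inst: (x - 3.0) ** 2 + random.gauss(0, 0.1)
      sample = lambda e: random.uniform(0, 10) if e is None else min(10, max(0, random.gauss(e, 1.0)))
      print(iterated_race(run, sample, instances=range(8), budget=500))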

  6. Automatic design of algorithms from algorithm frameworks

  7. General approach

  8. Main approaches
  Top-down approaches:
  ◮ develop a flexible framework following a fixed algorithm template with alternatives
  ◮ apply high-performing configurators
  ◮ examples: SATenstein, MOACO, MOEA, MIP solvers?(!)
  Bottom-up approaches:
  ◮ flexible framework implementing algorithm components
  ◮ define rules for composing algorithms from components, e.g. through grammars
  ◮ frequent use of genetic programming, grammatical evolution, etc.

  9. Top-down configuration
  Multi-objective evolutionary algorithms (MOEA)

  10. Multi-objective evolutionary algorithms
  ◮ Pareto-based (NSGA-II, SPEA2)
  ◮ Indicator-based (IBEA, SMS-EMOA)
  ◮ Weight-based (MOGLS, MOEA/D)
  We initially focused on building an automatically configurable component-wise framework for Pareto- and indicator-based MOEAs.

  11. MOEA framework: outline [Bezerra, López-Ibáñez, Stützle, 2016]

  12. Preference relations in mating / replacement
  Component       | Parameters
  Preference      | ⟨Set-partitioning, Quality, Diversity⟩
  BuildMatingPool | ⟨Preference_Mat, Selection⟩
  Replacement     | ⟨Preference_Rep, Removal⟩
  Replacement_Ext | ⟨Preference_Ext, Removal_Ext⟩
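  A minimal Python sketch (hypothetical function names, not the framework's actual code) of how such a composed preference ⟨set-partitioning, quality, diversity⟩ can act: individuals are ordered by the partition they fall into, ties are refined by the quality metric, and remaining ties by the diversity metric.

  # Hypothetical sketch of a composed preference <set-partitioning, quality, diversity>:
  # order by partition index first, refine ties with a quality metric, then with a
  # diversity metric (all metrics assumed "lower is better").
  def composed_preference(population, set_part, quality, diversity):
      """Return the population ordered from most preferred to least preferred."""
      return sorted(population,
                    key=lambda ind: (set_part(ind), quality(ind), diversity(ind)))

  # Example: an NSGA-II-like ordering would plug in nondominated-sorting depth as
  # set_part, a constant as quality, and negated crowding distance as diversity.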

  13. Representing known MOEAs
  Alg.    | BuildMatingPool: SetPart, Quality, Diversity, Selection | Replacement: SetPart, Quality, Diversity, Removal
  MOGA    | rank, —, niche-sharing, DT                              | —, —, —, generational
  NSGA-II | depth, —, crowding, DT                                  | depth, —, crowding, one-shot
  SPEA2   | strength, —, kNN, DT                                    | strength, —, kNN, sequential
  IBEA    | —, binary, —, DT                                        | —, binary, —, one-shot
  HypE    | —, I_h^H, —, DT                                         | depth, I_h^H, —, sequential
  SMS     | —, —, —, random                                         | depth-rank, I_H^1, —, —
  (All MOEAs above use a fixed-size population and no external archive; in addition, SMS-EMOA uses λ = 1.)

  14. MO configuration
  irace + performance indicator (e.g. hypervolume) = automatic configuration of multi-objective solvers!
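  A minimal Python sketch of this idea (hypothetical helper names; bi-objective minimization and a fixed reference point assumed, not the actual evaluation code used in the experiments): each run of the MOEA is reduced to a single cost value by computing the hypervolume of its approximation front and negating it, since configurators such as irace minimize cost.

  # Hypothetical sketch: reduce a multi-objective run to a scalar cost for a configurator
  # by negating the hypervolume of its approximation front.
  def hypervolume_2d(front, ref):
      """Hypervolume (to be maximized) of a 2-objective minimization front w.r.t. ref."""
      pts = sorted(p for p in front if p[0] < ref[0] and p[1] < ref[1])
      hv, prev_f2 = 0.0, ref[1]
      for f1, f2 in pts:                 # f1 increases; only nondominated points add area
          if f2 < prev_f2:
              hv += (ref[0] - f1) * (prev_f2 - f2)
              prev_f2 = f2
      return hv

  def configurator_cost(front, ref=(1.1, 1.1)):
      return -hypervolume_2d(front, ref)   # configurators minimize, so negate

  if __name__ == "__main__":
      print(configurator_cost([(0.2, 0.9), (0.5, 0.5), (0.9, 0.1)]))   # about -0.5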

  15. Experimental results (per benchmark and number of objectives; algorithms ordered from best to worst, rank sums in parentheses)
  DTLZ, 2-obj (∆R = 126): Auto (1339), SPEA2 (1562), IBEA (1940), NSGA-II (2143), HypE (2338), SMS (2406), MOGA (2970)
  DTLZ, 3-obj (∆R = 127): Auto (1500), IBEA (1719), SMS (1918), HypE (2019), SPEA2 (2164), NSGA-II (2528), MOGA (2851)
  DTLZ, 5-obj (∆R = 107): Auto (1002), SMS (1550), IBEA (1867), SPEA2 (2345), NSGA-II (2346), HypE (2674), MOGA (2915)
  WFG, 2-obj (∆R = 169): Auto (1692), SPEA2 (2097), NSGA-II (2542), SMS (2621), IBEA (2777), HypE (2851), MOGA (4320)
  WFG, 3-obj (∆R = 130): Auto (1375), SMS (1796), IBEA (1843), SPEA2 (2600), NSGA-II (3315), HypE (3431), MOGA (4540)
  WFG, 5-obj (∆R = 97): Auto (1170), SMS (1567), IBEA (1746), SPEA2 (2747), NSGA-II (3029), MOGA (4268), HypE (4373)

  16. Additional remarks
  ◮ additional results
    ◮ time-constrained scenarios
    ◮ cross-benchmark comparison
    ◮ applications to multi-objective flow-shop scheduling
  ◮ extended version of AutoMOEA
    ◮ extensions of the template (weights, local search, etc.)
    ◮ more comprehensive benchmark sets
    ◮ in-depth comparison of MOEAs
    ◮ design space analysis (e.g. ablation)
  ◮ other examples
    ◮ single-objective frameworks for MIP: CPLEX, SCIP
    ◮ single-objective framework for SAT: SATenstein
    ◮ continuous optimization: UACOR, ABC-X
    ◮ multi-objective algorithm frameworks (TP+PLS, MOACO)

  17. Bottom-up configuration of hybrid SLS algorithms
  Automatic design of hybrid SLS algorithms

  18. Automatic design of hybrid SLS algorithms [Marmion, Mascia, López-Ibáñez, Stützle, 2013]
  Approach:
  ◮ decompose single-point SLS methods into components
  ◮ derive a generalized metaheuristic structure
  ◮ component-wise implementation of the metaheuristic part
  Implementation:
  ◮ represent possible algorithm compositions by a grammar
  ◮ instantiate the grammar using a parametric representation
  ◮ allows use of standard automatic configuration tools
  ◮ shows good performance when compared to, e.g., grammatical evolution [Mascia, López-Ibáñez, Dubois-Lacoste, Stützle, 2014]

  19. General local search structure: ILS
      s0 := initSolution
      s* := ls(s0)
      repeat
          s' := perturb(s*, history)
          s*' := ls(s')
          s* := accept(s*, s*', history)
      until termination criterion met
  ◮ many SLS methods instantiable from this structure
  ◮ abilities:
    ◮ hybridization (through recursion)
    ◮ problem-specific implementation at the low level
    ◮ separation of generic and problem-specific components
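  A compact Python sketch of this skeleton (hypothetical component names, not the framework's code): note that the ls argument may itself be another instance of the same skeleton, which is exactly how hybrids arise through recursion.

  import random

  def ils(init, ls, perturb, accept, terminate):
      """Generic ILS skeleton; ls may itself be an ILS instance (recursion = hybrids)."""
      s_best = ls(init())
      history = [s_best]
      while not terminate(history):
          s_cand = ls(perturb(s_best, history))      # perturb the incumbent, then local search
          s_best = accept(s_best, s_cand, history)   # acceptance criterion decides what to keep
          history.append(s_best)
      return s_best

  if __name__ == "__main__":
      # toy demo: minimize f(x) = x^2 over the integers
      f = lambda x: x * x
      print(ils(init=lambda: random.randint(-50, 50),
                ls=lambda x: min((x - 1, x, x + 1), key=f),      # one best-improvement step
                perturb=lambda x, h: x + random.randint(-5, 5),
                accept=lambda cur, new, h: new if f(new) <= f(cur) else cur,
                terminate=lambda h: len(h) > 200))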

  20. Grammar
  <algorithm> ::= <initialization> <ils>
  <initialization> ::= random | <pbs_initialization>
  <ils> ::= ILS(<perturb>, <ls>, <accept>, <stop>)
  <perturb> ::= none | <initialization> | <pbs_perturb>
  <ls> ::= <ils> | <descent> | <sa> | <rii> | <pii> | <vns> | <ig> | <pbs_ls>
  <accept> ::= alwaysAccept | improvingAccept <comparator> | prob(<value_prob_accept>) | probRandom | <metropolis> | threshold(<value_threshold_accept>) | <pbs_accept>
  <descent> ::= bestDescent(<comparator>, <stop>) | firstImprDescent(<comparator>, <stop>)
  <sa> ::= ILS(<pbs_move>, no_ls, <metropolis>, <stop>)
  <rii> ::= ILS(<pbs_move>, no_ls, probRandom, <stop>)
  <pii> ::= ILS(<pbs_move>, no_ls, prob(<value_prob_accept>), <stop>)
  <vns> ::= ILS(<pbs_variable_move>, firstImprDescent(improvingStrictly), improvingAccept(improvingStrictly), <stop>)
  <ig> ::= ILS(<deconst-construct_perturb>, <ls>, <accept>, <stop>)
  <comparator> ::= improvingStrictly | improving
  <value_prob_accept> ::= [0, 1]
  <value_threshold_accept> ::= [0, 1]
  <metropolis> ::= metropolisAccept(<init_temperature>, <final_temperature>, <decreasing_temperature_ratio>, <span>)
  <init_temperature> ::= {1, 2, ..., 10000}
  <final_temperature> ::= {1, 2, ..., 100}
  <decreasing_temperature_ratio> ::= [0, 1]
  <span> ::= {1, 2, ..., 10000}
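  One way to picture how such a grammar becomes configurable (a hypothetical Python sketch, not the actual encoding used): every expansion choice becomes a categorical or numeric parameter, conditions record which parameters are active given earlier choices, and bounding the recursion <ls> → <ils> keeps the parameter space finite so a standard configurator such as irace can search it directly.

  # Hypothetical sketch of the "grammar as a parametric representation" idea.
  parameters = {
      # name:        (type,  domain,                                          condition)
      "ls":          ("cat", ["descent", "sa", "rii", "pii", "ig", "ils"],    None),
      "accept":      ("cat", ["alwaysAccept", "improvingAccept", "prob",
                              "probRandom", "metropolis", "threshold"],       None),
      "prob_accept": ("real", (0.0, 1.0),   "accept == 'prob'"),
      "init_temp":   ("int",  (1, 10000),   "accept == 'metropolis'"),
      "final_temp":  ("int",  (1, 100),     "accept == 'metropolis'"),
      # one bounded level of recursion: the inner ILS gets its own copies of the choices
      "ls1_ls":      ("cat", ["descent", "sa", "rii", "pii", "ig"],           "ls == 'ils'"),
      "ls1_accept":  ("cat", ["alwaysAccept", "improvingAccept", "prob"],     "ls == 'ils'"),
  }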

  21. System overview

  22. Flow-shop problem with makespan objective [Pagnozzi, Stützle, 2016]
  Automatic configuration:
  ◮ max. three levels of recursion
  ◮ biased / unbiased grammar, resulting in 262 and 502 parameters, respectively
  ◮ budget: 200 000 trials of n · m · 0.03 seconds each
  [Plot: ARPD with 95% confidence limits for IGrs, IGtb, irace1, irace2, irace3, irace4]
  Results are clearly superior to the state of the art.
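  For concreteness (assuming, as is standard for these benchmarks, that n is the number of jobs and m the number of machines): a 100-job, 20-machine instance then gets 100 · 20 · 0.03 = 60 seconds per evaluation, while a 50 × 10 instance gets 15 seconds.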
