  1. Automatic Algorithm Configuration
     Thomas Stützle
     IRIDIA, CoDE, Université Libre de Bruxelles, Brussels, Belgium
     stuetzle@ulb.ac.be, iridia.ulb.ac.be/~stuetzle

     Outline: 1. Context; 2. Automatic algorithm configuration; 3. Applications; 4. Concluding remarks

     TPNC 2017, Prague, Czech Republic

  2. Context
     The algorithmic solution of hard optimization problems is one of the CS/OR success stories!
     - Exact (systematic search) algorithms
       - Branch & Bound, Branch & Cut, constraint programming, ...
       - powerful general-purpose software available
       - guarantees on optimality, but often time/memory consuming
     - Approximate algorithms
       - heuristics, local search, metaheuristics, hyperheuristics, ...
       - typically special-purpose software
       - rarely provable guarantees, but often fast and accurate
     Much active research on hybrids between exact and approximate algorithms!

     Design choices and parameters everywhere
     Today's high-performance optimizers involve a large number of design choices and parameter settings.
     - Exact solvers
       - design choices include alternative models, pre-processing, variable selection, value selection, branching rules, ...
       - many design choices have associated numerical parameters
       - example: the SCIP 3.0.1 solver (fastest non-commercial MIP solver) has more than 200 relevant parameters that influence the solver's search mechanism
     - Approximate algorithms
       - design choices include solution representation, operators, neighborhoods, pre-processing, strategies, ...
       - many design choices have associated numerical parameters
       - example: multi-objective ACO algorithms with 22 parameters (plus several still hidden ones)

  3. ILS design choices and numerical parameters
     - choice of constructive procedure
       - random vs. greedy construction
       - static vs. adaptive construction
       - how many initial solutions?
     - perturbation
       - type of perturbation
       - fixed vs. variable size of destruction
       - random vs. biased perturbation
     - acceptance criterion
       - strength of bias towards best solutions
       - use of history information and, if yes, how
     - local search
       - ... many choices ...
     - numerical parameters
       - perturbation size
       - parameters associated with the type of perturbation
       - parameters related to the acceptance criterion

     Parameter types
     - categorical parameters (design)
       - choice of constructive procedure, choice of recombination operator, choice of branching strategy, ...
     - ordinal parameters (design)
       - neighborhoods, lower bounds, ...
     - numerical parameters (tuning, calibration)
       - integer or real-valued parameters
       - weighting factors, population sizes, temperature, hidden constants, ...
       - numerical parameters may be conditional on specific values of categorical or ordinal parameters
     Design and configuration of algorithms involves setting categorical, ordinal, and numerical parameters.
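The three parameter types and their conditional dependencies can be captured in a small declarative description. A minimal sketch in Python; the parameter names and the dictionary layout are illustrative assumptions, not taken from any configuration tool:

```python
# Illustrative configuration space mixing categorical, ordinal, and
# numerical parameters; "greedy_alpha" is conditional on a categorical
# choice, as described above. All names here are hypothetical.
param_space = {
    "construction": {"type": "categorical", "values": ["random", "greedy"]},
    "neighborhood": {"type": "ordinal", "values": ["1-exchange", "2-exchange", "3-exchange"]},
    "perturb_size": {"type": "integer", "range": (1, 20)},
    "greedy_alpha": {"type": "real", "range": (0.0, 1.0),
                     # active only when construction == "greedy"
                     "condition": ("construction", "greedy")},
}

def active_params(config):
    """Return the parameters that are active under the given configuration."""
    active = []
    for name, spec in param_space.items():
        cond = spec.get("condition")
        if cond is None or config.get(cond[0]) == cond[1]:
            active.append(name)
    return active
```

A configurator only needs to sample and tune the parameters that are active under the current categorical choices; inactive conditional parameters can simply be skipped.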

  4. Designing optimization algorithms
     Challenges:
     - many alternative design choices
     - nonlinear interactions among algorithm components and/or parameters
     - performance assessment is difficult
     Traditional design approach:
     - trial-and-error design guided by expertise/intuition
     - prone to over-generalizations, implicit independence assumptions, limited exploration of design alternatives
     Can we make this approach more principled and automatic?

     Towards automatic algorithm configuration
     Automated algorithm configuration:
     - apply powerful search techniques to design algorithms
     - use computation power to explore design spaces
     - assist the algorithm designer in the design process
     - free human creativity for higher-level tasks

  5. Offline configuration and online parameter control
     Offline configuration:
     - configure the algorithm before deploying it
     - configuration on training instances
     - related to algorithm design
     Online parameter control:
     - adapt parameter settings while solving an instance
     - typically limited to a set of known crucial algorithm parameters
     - related to parameter calibration
     Offline configuration techniques can be helpful to configure (online) parameter control strategies.

     Offline configuration: typical performance measures
     - maximize solution quality (within a given computation time)
     - minimize computation time (to reach an optimal solution)

  6. Configurators: approaches to configuration
     - experimental design techniques
       - e.g. CALIBRA [Adenso-Díaz & Laguna, 2006], [Ridge & Kudenko, 2007], [Coy et al., 2001], [Ruiz & Stützle, 2005]
     - numerical optimization techniques
       - e.g. MADS [Audet & Orban, 2006], various [Yuan et al., 2012]
     - heuristic search methods
       - e.g. meta-GA [Grefenstette, 1985], ParamILS [Hutter et al., 2007, 2009], gender-based GA [Ansótegui et al., 2009], linear GP [Oltean, 2005], REVAC(++) [Eiben et al., 2007, 2009, 2010], ...
     - model-based optimization approaches
       - e.g. SPO [Bartz-Beielstein et al., 2005, 2006, ...], SMAC [Hutter et al., 2011, ...], GGA++ [Ansótegui et al., 2015]
     - sequential statistical testing
       - e.g. F-Race, iterated F-Race [Birattari et al., 2002, 2007, ...]
     General, domain-independent methods are required: (i) applicable to all variable types, (ii) multiple training instances, (iii) high performance, (iv) scalable.

  7. The racing approach
     Given an initial set Θ of candidate configurations:
     - consider a stream of instances
     - sequentially evaluate the candidates
     - discard inferior candidates as sufficient evidence is gathered against them
     - ... repeat until a winner is selected or until the computation time expires
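The racing loop above can be sketched in a few lines. This is a deliberately simplified version: where F-Race applies a rank-based statistical test, the sketch discards a candidate once its mean cost exceeds the best mean by a fixed gap, and `evaluate` is a hypothetical stand-in for running a configuration on an instance:

```python
import random

def race(candidates, instances, evaluate, min_results=5, gap=0.2):
    """Sequentially evaluate candidates on a stream of instances,
    discarding those with sufficient evidence of inferiority.
    Simplified sketch: a real racing method (e.g. F-Race) would use
    the Friedman test instead of this fixed mean-cost gap."""
    costs = {c: [] for c in candidates}
    alive = list(candidates)
    for inst in instances:
        for c in alive:
            costs[c].append(evaluate(c, inst))
        # only start discarding once enough evidence has accumulated
        if len(costs[alive[0]]) >= min_results:
            means = {c: sum(costs[c]) / len(costs[c]) for c in alive}
            best = min(means.values())
            alive = [c for c in alive if means[c] <= best + gap]
        if len(alive) == 1:
            break  # a single winner has been selected
    return alive

# toy usage: each candidate's value is its cost plus small noise,
# so the race should quickly single out the cheapest candidate
random.seed(1)
winners = race(candidates=[0.0, 0.5, 2.0],
               instances=range(30),
               evaluate=lambda c, i: c + random.random() * 0.1)
# winners == [0.0]: only the cheapest candidate survives
```

The key property is that poor candidates are eliminated early, so the evaluation budget concentrates on the promising ones.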

  8. The F-Race algorithm: statistical testing
     1. Family-wise test for differences among configurations:
        the Friedman two-way analysis of variance by ranks.
     2. If the Friedman test rejects H0, perform pairwise comparisons to the best configuration:
        apply the Friedman post-test.

     Iterated race
     Racing is a method for selecting the best configuration and is independent of the way the set of configurations is sampled.
     Iterated race:
        sample configurations from the initial distribution
        while not terminate():
           apply race
           modify the sampling distribution
           sample new configurations
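The iterated-race outline can be made concrete for a single real-valued parameter. A minimal sketch, under assumed simplifications: the "race" step just keeps the cheapest half of the configurations, and the sampling distribution is a normal recentred on the elite mean with a spread that shrinks each iteration:

```python
import random

def iterated_race(cost, low, high, n=10, iterations=8, seed=0):
    """Toy iterated race for one real parameter in [low, high]:
    race  -> keep the n//2 cheapest configurations (the elites),
    update -> recentre a normal distribution on the elite mean,
              with a spread that decreases over iterations,
    sample -> draw new candidates around the elites, clipped to range."""
    rng = random.Random(seed)
    configs = [rng.uniform(low, high) for _ in range(n)]  # initial distribution
    sigma = (high - low) / 2
    for _ in range(iterations):
        configs.sort(key=cost)
        elites = configs[: n // 2]              # race: survivors
        mu = sum(elites) / len(elites)          # modify sampling distribution
        sigma *= 0.7                            # spread shrinks each iteration
        configs = elites + [min(high, max(low, rng.gauss(mu, sigma)))
                            for _ in range(n - len(elites))]
    return min(configs, key=cost)

# usage: tune x to minimize (x - 3)^2 over [0, 10]
best = iterated_race(lambda x: (x - 3.0) ** 2, 0.0, 10.0)
```

Because the elites are carried over unchanged, the best configuration found never worsens, while new samples concentrate around increasingly promising regions.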

  9. Iterated racing: sampling distributions
     (The slide shows bar charts of sampling probabilities over parameter values x1, x2, x3.)

     Numerical parameter X_d in [x_d^min, x_d^max]:
     sample from a truncated normal distribution N(mu_d^z, sigma_d^i) restricted to [x_d^min, x_d^max], where
     - mu_d^z = value of parameter d in elite configuration z
     - sigma_d^i decreases with the number of iterations i
     Categorical parameter X_d in {x_1, x_2, ..., x_{n_d}}:
     sample from a discrete probability distribution Pr_z{X_d = x_j}, e.g. (0.1, 0.3, ..., 0.4).
     - updated by increasing the probability of the parameter value in the elite configuration
     - the other probabilities are reduced
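The two sampling rules above can be sketched directly. The rejection loop for the truncated normal and the additive probability update (with step size `step`) are implementation assumptions chosen for simplicity, not the exact rules of any particular tool:

```python
import random

def sample_truncated_normal(mu, sigma, lo, hi, rng):
    """Sample a numerical parameter from N(mu, sigma) truncated to
    [lo, hi] by rejection; mu is the elite's value for this parameter
    and sigma shrinks over iterations, as described above."""
    while True:
        x = rng.gauss(mu, sigma)
        if lo <= x <= hi:
            return x

def update_categorical(probs, elite_value, step=0.1):
    """Increase the probability of the elite configuration's value and
    renormalize, so all other values' probabilities are reduced."""
    new = {v: p + (step if v == elite_value else 0.0)
           for v, p in probs.items()}
    total = sum(new.values())
    return {v: p / total for v, p in new.items()}
```

For example, starting from a uniform distribution {"a": 0.5, "b": 0.5}, one update toward elite value "a" raises its probability while "b" shrinks, and the distribution still sums to one.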

  10. The irace package
      Manuel López-Ibáñez, Jérémie Dubois-Lacoste, Thomas Stützle, and Mauro Birattari. The irace Package: Iterated Race for Automatic Algorithm Configuration. Technical Report TR/IRIDIA/2011-004, IRIDIA, Université Libre de Bruxelles, Belgium, 2011; extended version: Operations Research Perspectives, 2016.
      The irace Package: User Guide. Technical Report TR/IRIDIA/2016-004, 2016.
      http://iridia.ulb.ac.be/irace
      - implementation of iterated racing in R
      - Goal 1: flexible; Goal 2: easy to use
        - no knowledge of R necessary
      - parallel evaluation (MPI, multi-core, grid engine, ...)
      - capping for run-time optimization
      irace has been shown to be effective for configuration tasks with several hundred variables.

      The irace package: usage
      irace takes as input a parameter space, a configuration scenario, and a set of training instances; it calls a user-provided targetRunner with a configuration θ and an instance i, and the targetRunner returns the cost c(θ, i).
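The targetRunner is the single point of contact between irace and the algorithm being tuned: it receives θ and i and must report c(θ, i). The sketch below only illustrates the shape of that interface in Python; the solver call is a hypothetical placeholder, and the real command-line protocol (arguments, output format) is defined in the irace user guide cited above:

```python
import random

def target_runner(configuration, instance, seed):
    """Illustrative stand-in for a targetRunner: given a configuration
    (dict of parameter values), an instance, and a seed, run the target
    algorithm and return its cost c(theta, i). The 'solver' below is a
    hypothetical placeholder, not a real algorithm."""
    rng = random.Random(seed)
    # hypothetical solver: cost depends on configuration and instance,
    # plus a little seeded noise to mimic a stochastic algorithm
    cost = abs(configuration["perturb_size"] - instance) + rng.random() * 0.01
    return cost
```

Because irace only ever sees (θ, i) pairs and their costs, the tuned algorithm can be written in any language, which is why no knowledge of R is needed on the user's side.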
