DM63 – Heuristics for Combinatorial Optimization Problems
Lecture 9: Other ’Simple’ Metaheuristics
Marco Chiarandini

Outline
1. Results Task 2
2. Dynamic Local Search
3. Iterated Local Search
4. Exercise
5. Evolutionary Algorithms


Experimental Set-up
◮ 15 new flat instances created:

    Type                   # instances   Upper bound
    flat-1000-50-0-?.col   5             50
    flat-1000-60-0-?.col   5             60
    flat-1000-76-0-?.col   5             76

◮ each algorithm run once on each of the 15 new instances
◮ fairness principle: same computational resources for all algorithms
  ⇒ 90 seconds on an Intel(R) Celeron(R) CPU 2.40GHz with 1 GB RAM (120 seconds for 230183)
◮ restart ROS heuristic used as reference algorithm
◮ restart RLF and DSATUR also included

Results

Error measures used for the comparison:
◮ Percentage error: (x − x_opt) / x_opt · 100%
◮ Standard error: (x − x̄) / σ
◮ Invariant error: (x − x_opt) / (x_ROS − x_opt)

[Figure residue removed: box plots and rank plots of these error measures for the algorithms ROS, DSATUR, RLF, 141179, 191076, 230183, 240284, 270383 on the flat-1000 instance classes and on le450_25c.col / le450_25d.col, together with a table of the number of colours obtained per algorithm on flat-1000-50/-60/-76 (values between 98 and 120).]
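The error measures above translate directly into code; a small sketch with made-up numbers (x_opt and the ROS reference value are hypothetical, for illustration only):

```python
# Error measures from the slide, applied to hypothetical result values.
def percentage_error(x, x_opt):
    # (x - x_opt) / x_opt, expressed in percent
    return (x - x_opt) / x_opt * 100.0

def invariant_error(x, x_opt, x_ros):
    # normalized against the ROS reference: 0 = optimal, 1 = as bad as ROS
    return (x - x_opt) / (x_ros - x_opt)

x_opt, x_ros = 50, 120    # hypothetical optimum and ROS reference result
x = 98                    # hypothetical result of some algorithm
print(percentage_error(x, x_opt))                 # 96.0
print(round(invariant_error(x, x_opt, x_ros), 3))
```

The invariant error is scale-free: it does not change when the instance is rescaled, which makes results comparable across instance classes.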

Program Profiling
◮ Plot the development of
  ◮ best visited solution quality
  ◮ current solution quality
  over time and compare with other features of the algorithm.
◮ Profile time consumption per program component under Linux with gprof:
  1. add the flag -pg at compilation
  2. run the program
  3. gprof program-file > a.txt

Dynamic Local Search
◮ Key Idea: modify the evaluation function whenever a local optimum is encountered.
◮ Associate penalty weights (penalties) with solution components; these determine the impact of the components on the evaluation function value.
◮ Perform Iterative Improvement; when in a local minimum, increase the penalties of some solution components until improving steps become available.

Dynamic Local Search (DLS):
    determine initial candidate solution s
    initialize penalties
    While termination criterion is not satisfied:
    |   compute modified evaluation function g′ from g based on penalties
    |   perform subsidiary perturbative search on s using evaluation function g′
    ⌊   update penalties based on s

Dynamic Local Search (continued)
◮ Modified evaluation function:
      g′(π, s) := g(π, s) + ∑_{i ∈ SC(π′, s)} penalty(i),
  where SC(π′, s) is the set of solution components of problem instance π′ used in candidate solution s.
◮ Penalty initialization: for all i: penalty(i) := 0.
◮ Penalty update in local minimum s: typically involves a penalty increase for some or all solution components of s; often also occasional penalty decrease or penalty smoothing.
◮ Subsidiary perturbative search: often Iterative Improvement.
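The DLS scheme above can be sketched on a toy one-dimensional landscape (a minimal illustration with hypothetical values; each position plays the role of a single solution component, so SC(π′, s) is just {s}):

```python
# Dynamic Local Search sketch on a toy 1-D landscape.
# Hypothetical instance: positions 0..6 with these solution qualities.
values = [3, 2, 3, 1, 3, 0, 3]
penalties = [0] * len(values)

def g(s):                      # original evaluation function
    return values[s]

def g_mod(s):                  # modified evaluation function g' = g + penalty
    return values[s] + penalties[s]

def neighbors(s):              # 1-D neighborhood: adjacent positions
    return [p for p in (s - 1, s + 1) if 0 <= p < len(values)]

def iterative_improvement(s):  # subsidiary perturbative search on g'
    while True:
        best = min(neighbors(s), key=g_mod)
        if g_mod(best) >= g_mod(s):
            return s           # local minimum of g'
        s = best

s = 0                          # initial candidate solution
best_s = s                     # best solution w.r.t. the ORIGINAL g
for _ in range(20):            # termination criterion: iteration budget
    s = iterative_improvement(s)
    if g(s) < g(best_s):
        best_s = s
    penalties[s] += 1          # penalty update in local minimum

print(best_s, g(best_s))       # prints: 5 0
```

Note that the search first gets trapped in the local minimum at position 1; the growing penalties eventually make the surrounding plateau unattractive under g′, and the search escapes to the global minimum at position 5.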

Example: Guided Local Search (GLS) for the TSP  [Voudouris and Tsang 1995; 1999]
◮ Given: TSP instance G with n vertices
◮ Search space: Hamiltonian cycles in G
◮ Neighborhood: 2-edge-exchange
◮ Solution components: edges e of G; g_e(G, p) := w(e)
◮ Penalty initialization: set all edge penalties to zero.
◮ Subsidiary perturbative search: Iterative First Improvement.
◮ Penalty update: increment the penalties of all edges with maximal utility by
      λ := 0.3 · w(s_2-opt) / n,
  where s_2-opt is a 2-optimal tour.

Potential problem:
Solution components required for (optimal) solutions may also be present in many local minima.
Possible solutions:
A: Occasional decreases/smoothing of penalties.
B: Only increase penalties of solution components that are least likely to occur in (optimal) solutions.

Implementation of B [Voudouris and Tsang, 1995]:
Only increase the penalties of solution components i with maximal utility
      util(s′, i) := g_i(π, s′) / (1 + penalty(i)),
where g_i(π, s′) is the solution quality contribution of i in s′.

Hybrid Methods
The combination of ‘simple’ methods often yields substantial performance improvements.
Simple examples:
◮ Commonly used restart mechanisms can be seen as hybridisations with Uninformed Random Picking.
◮ Iterative Improvement + Uninformed Random Walk = Randomized Iterative Improvement.
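The GLS penalty update for the TSP described above can be sketched on a toy 4-vertex instance (the edge weights are hypothetical, and the current 2-optimal tour is assumed to consist of exactly the four listed edges):

```python
# Sketch of the GLS penalty update for the TSP on a toy 4-vertex instance
# (hypothetical edge weights; the 2-optimal tour uses the four edges below).
edge_weight = {("a", "b"): 4, ("b", "c"): 2, ("c", "d"): 4, ("d", "a"): 3}
penalty = {e: 0.0 for e in edge_weight}

def utility(edge):
    # util(s', i) := g_i(pi, s') / (1 + penalty(i)), with g_i = edge weight
    return edge_weight[edge] / (1 + penalty[edge])

def penalty_update(tour_edges, n):
    # lambda := 0.3 * w(s_2-opt) / n
    lam = 0.3 * sum(edge_weight[e] for e in tour_edges) / n
    max_util = max(utility(e) for e in tour_edges)
    for e in tour_edges:
        if utility(e) == max_util:      # only edges with maximal utility
            penalty[e] += lam

tour = list(edge_weight)                # edges of the current 2-optimal tour
penalty_update(tour, n=4)
print(penalty)
```

Here only the two heaviest (and so far unpenalized) edges receive a penalty increment of λ = 0.3 · 13 / 4 = 0.975; the 1 + penalty(i) denominator in the utility ensures that repeatedly penalized edges become less likely to be penalized again.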

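The second hybrid above, Randomized Iterative Improvement, can be sketched on a toy one-dimensional landscape (hypothetical values; the "improvement step" here simply moves to the best neighbour, one common variant):

```python
import random

# Randomized Iterative Improvement sketch on a toy 1-D landscape
# (hypothetical values): with probability wp take an uninformed random-walk
# step, otherwise move to the best neighbour.
values = [4, 2, 4, 3, 5, 1, 4, 0, 4]
wp = 0.5                                  # walk probability

def neighbors(s):
    return [p for p in (s - 1, s + 1) if 0 <= p < len(values)]

def rii(s, steps, rng):
    best = s
    for _ in range(steps):
        if rng.random() < wp:
            s = rng.choice(neighbors(s))                      # random walk step
        else:
            s = min(neighbors(s), key=lambda p: values[p])    # improvement step
        if values[s] < values[best]:
            best = s                      # track best visited solution
    return best

best = rii(0, 5000, random.Random(1))
print(best, values[best])
```

The random-walk steps provide the diversification that plain iterative improvement lacks: given enough steps, the search escapes the local minima at positions 1, 3 and 5 and visits the global minimum at position 7.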
Iterated Local Search
Key Idea: use two types of LS steps:
◮ subsidiary perturbative (local) search steps for reaching local optima as efficiently as possible (intensification);
◮ perturbation steps for effectively escaping from local optima (diversification).
Also: use an acceptance criterion to control the diversification vs intensification balance.

Iterated Local Search (ILS):
    determine initial candidate solution s
    perform subsidiary perturbative search on s
    While termination criterion is not satisfied:
    |   r := s
    |   perform perturbation on s
    |   perform subsidiary perturbative search on s
    ⌊   based on acceptance criterion, keep s or revert to s := r

Note:
◮ Subsidiary perturbative search results in a local minimum.
◮ ILS trajectories can be seen as walks in the space of local minima of the given evaluation function.
◮ Perturbation phase and acceptance criterion may use aspects of search history (i.e., limited memory).
◮ In a high-performance ILS algorithm, subsidiary perturbative search, perturbation mechanism, and acceptance criterion need to complement each other well.

Subsidiary perturbative search:
◮ More effective subsidiary perturbative search procedures lead to better ILS performance. Example: 2-opt vs 3-opt vs LK for the TSP.
◮ Often, subsidiary perturbative search = iterative improvement, but more sophisticated LS methods can be used (e.g., Tabu Search).

Perturbation mechanism:
◮ Needs to be chosen such that its effect cannot be easily undone by the subsequent perturbative search phase. (Often achieved by search steps in a larger neighborhood.) Example: perturbative search = 3-opt, perturbation = 4-exchange steps in ILS for the TSP.
◮ A perturbation phase may consist of one or more perturbation steps.
◮ Weak perturbation ⇒ short subsequent perturbative search phase; but: risk of revisiting the current local minimum.
◮ Strong perturbation ⇒ more effective escape from local minima; but: may have similar drawbacks as random restart.
◮ Advanced ILS algorithms may change the nature and/or strength of the perturbation adaptively during the search.
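The ILS outline above can be sketched on a toy one-dimensional landscape (hypothetical values; the perturbation is a deterministic jump of 3 positions with wrap-around, chosen for reproducibility, whereas real ILS perturbations are typically randomized):

```python
# Iterated Local Search sketch on a toy 1-D landscape (hypothetical values).
values = [4, 2, 4, 3, 5, 1, 4, 0, 4]

def neighbors(s):
    return [p for p in (s - 1, s + 1) if 0 <= p < len(values)]

def local_search(s):                      # subsidiary perturbative search
    while True:                           # (iterative best improvement)
        best = min(neighbors(s), key=lambda p: values[p])
        if values[best] >= values[s]:
            return s                      # local minimum reached
        s = best

def perturb(s):
    return (s + 3) % len(values)          # jump out of the current basin

s = local_search(0)                       # initial solution + local search
for _ in range(10):                       # termination criterion
    r = s                                 # remember incumbent
    s = local_search(perturb(s))
    if values[s] > values[r]:             # acceptance criterion:
        s = r                             # keep only non-worsening solutions
print(s, values[s])                       # prints: 7 0
```

The trajectory visits only local minima (positions 1, 5, 7), illustrating the "walk in the space of local minima" view; the jump of 3 positions is strong enough that the subsequent 1-D local search cannot simply undo it.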
