

  1. A “Futurist” approach to dynamic environments
Jano van Hemert, Leiden University, The Netherlands; Clarissa Van Hoyweghen, University of Antwerp, Belgium; Eduard Lukschandl, Ericsson & Hewlett-Packard; Katja Verbeeck, University of Brussels, Belgium
Presented by Jano van Hemert, jvhemert@liacs.nl, http://www.liacs.nl/~jvhemert

  2. Introduction — How it all started
Coil Summer School 2000, Limerick, Ireland
☞ People were assigned to groups to solve different problems
☞ Conor Ryan provided our group with two dynamic problems
☞ He had attempted to solve those problems using diploid chromosomes
☞ Our objective was to try to solve them using one of the techniques presented at the summer school
GECCO-2001 Workshop on Dynamic Optimization

  3. Introduction — How it all started
Coil Summer School 2000, Limerick, Ireland

  4. Introduction — The next half hour
① Problem descriptions
② General idea
③ Two tested implementations
④ Experiments & Results
⑤ Conclusions & Future Work
⑥ Questions & Discussion

  5. Problem descriptions — 0-1 Knapsack: definition
✔ The goal is to fill a knapsack with objects
✔ Each object has a weight and a value assigned to it
✔ Every 15 generations the maximum allowed weight is changed
✔ The maximum weight is switched between 50% and 80% of the total weight of all the objects
✔ A total of 400 generations (time steps) is used
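The changing constraint described above can be sketched as a small dynamic fitness function. This is a hypothetical reconstruction: the item weights/values and the penalty for overweight solutions are assumptions, since the slides do not specify them.

```python
import random

# Hypothetical problem instance; the slides do not give concrete items.
random.seed(1)
weights = [random.randint(1, 10) for _ in range(20)]
values = [random.randint(1, 10) for _ in range(20)]
total_weight = sum(weights)

def capacity(generation):
    """Capacity alternates between 50% and 80% of the total weight
    every 15 generations, as described on the slide."""
    phase = (generation // 15) % 2
    return (0.5 if phase == 0 else 0.8) * total_weight

def fitness(bits, generation):
    """Total value of packed items; overweight solutions score 0
    (the penalty scheme is an assumption)."""
    w = sum(wi for wi, b in zip(weights, bits) if b)
    v = sum(vi for vi, b in zip(values, bits) if b)
    return v if w <= capacity(generation) else 0
```

Because the capacity oscillates, the same bit string can be feasible in one phase and infeasible 15 generations later, which is what makes the problem dynamic.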

  6. Problem descriptions — 0-1 Knapsack: behaviour
[Figure: optimal value vs. generations (0-400)]
☞ The optimum changes over time

  7. Problem descriptions — Ošmera's function: definition
g₁(x, t) = 1 − e^(−200 (x − c(t))²), with c(t) = 0.04 ⌊t/20⌋,
x ∈ {0.000, ..., 2.000}, and each time step t ∈ {0, ..., 1000} equal to one generation
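The definition above translates directly into code. The minus sign in the exponent is assumed (it is the conventional form, under which the minimum value 0 is reached exactly at the moving optimum x = c(t)):

```python
import math

def c(t):
    # Moving optimum: c(t) = 0.04 * floor(t / 20); it steps from 0 to 2
    # as t runs from 0 to 1000.
    return 0.04 * (t // 20)

def g1(x, t):
    # Ošmera's dynamic test function: g1(x, t) = 1 - exp(-200 (x - c(t))^2).
    # Minimum 0 at x = c(t); approaches 1 far from the optimum.
    return 1.0 - math.exp(-200.0 * (x - c(t)) ** 2)
```

Every 20 generations the optimum jumps by 0.04, so a population tracking x = c(t) must keep moving.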

  8. Problem descriptions — Ošmera's function: behaviour
[Figure: surface plot of g₁(x, t) for x ∈ [0, 2], t ∈ [0, 1000], and the moving optimum c(t) for t ∈ [0, 100]]

  9. General idea — Predicting the future
[Diagram: on the timeline from 0 to maxgen, fitness values f(x, t) acquired up to generation t are used to regress a fitness predictor, which is then used for a future population at generation t + ∆]

  10. General idea — Learning from the future
[Diagram: migration from the future population into the current population]

  11. General idea — Parameters
✔ m determines how many individuals are copied to the current population: the best m from the future population are selected and overwrite the worst m in the current population
✔ ∆ determines how many generations ahead the future population lives
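The migration step governed by m can be sketched as follows. This is a minimal illustration, assuming higher fitness is better and populations represented as plain lists of individuals:

```python
def migrate(current, future, m, fitness):
    """Copy the best m individuals from the future population over the
    worst m individuals of the current population.

    A minimal sketch: individuals can be any hashable-free objects;
    `fitness` maps an individual to a score, higher = better.
    """
    best_future = sorted(future, key=fitness, reverse=True)[:m]
    survivors = sorted(current, key=fitness, reverse=True)[:len(current) - m]
    return survivors + best_future
```

With m = 0 the future population has no influence; with m equal to the population size the current population is replaced wholesale, so m trades off exploitation of the prediction against retention of the current search state.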

  12. Experimental setup — Two experiments
Perfect prediction
☞ The idea is that the best that could happen is a perfect prediction of the future
☞ For these problems this is very easy to implement, as we know the optimum for t + ∆ exactly
☞ If this is not successful, we should ask ourselves whether it is useful to continue with the idea of predicting the future
Noisy prediction
☞ Could the use of a predictor be harmful?
☞ We give the algorithm noisy and deceptive predictions of the future
☞ The knapsack problem gets a wrong optimum (deceptive) and Ošmera's function gets a random value
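The three predictor variants above can be sketched like this. These are hypothetical helper functions: the slides only state that the deceptive predictor reports a wrong optimum (for the knapsack) and the noisy one a random value (for Ošmera), so the exact forms are assumptions:

```python
import random

def perfect_predictor(capacity, t, delta):
    # Perfect prediction: the true capacity at generation t + delta.
    return capacity(t + delta)

def deceptive_predictor(capacity, t, delta, total_weight):
    # Deceptive prediction (knapsack): report the *other* capacity level,
    # i.e. a wrong optimum (an assumed reading of "wrong optimum").
    true = capacity(t + delta)
    return 0.8 * total_weight if true == 0.5 * total_weight else 0.5 * total_weight

def noisy_predictor(lo=0.0, hi=2.0):
    # Noisy prediction (Ošmera): a random optimum location in [lo, hi].
    return random.uniform(lo, hi)
```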

  13. Experiments & Results — Experimental setup
For both problems we do
✔ a test without any future population
✔ tests with four parameter settings (two pairs) for the perfect predictor
✔ tests with four parameter settings (two pairs) for the noisy/deceptive predictor
✔ 50 runs for each test, with unique random seeds

  14. Experiments & Results — Knapsack results

predictor   ∆    m    error   stdev   best run
none        ×    ×    16.6%   3.52    8.96%
perfect     5    10   11.9%   3.77    4.85%
perfect     15   10   20.3%   4.26    11.7%
perfect     5    50   11.8%   3.70    5.97%
perfect     15   50   21.4%   6.06    12.0%
deceptive   5    10   12.6%   4.12    6.77%
deceptive   15   10   13.0%   3.74    7.50%
deceptive   5    50   12.7%   4.02    6.47%
deceptive   15   50   12.8%   4.07    4.99%

  15. Experiments & Results — Knapsack results (plots)
[Figure: error (%) vs. generations (0-400) for the runs "knapsack_clean", "knapsack_perfect_d5_m10", and "knapsack_noisy_d15_m50"]

  16. Experiments & Results — Ošmera results

predictor   ∆    m    error    stdev   best run
none        ×    ×    63.8%    10.3    41.2%
perfect     5    10   0.261%   0.153   0.0751%
perfect     5    50   0.168%   0.148   0.0266%
perfect     10   10   0.241%   0.220   0.0680%
perfect     10   50   0.203%   0.099   0.0698%
noisy       5    10   0.241%   0.186   0.0488%
noisy       5    50   0.144%   0.122   0.0358%
noisy       10   10   0.241%   0.186   0.0488%
noisy       10   50   0.168%   0.148   0.0266%

  17. Experiments & Results — Ošmera results (plots)
[Figure: error (%) vs. generations (0-1000) for the runs "osmera_clean", "osmera_perfect_d5_m50", and "osmera_noisy_d10_m50"]

  18. Conclusions & Future Research — Conclusions: pros and cons
✘ The knapsack problem is better solved with a look-ahead time of 5 generations than with 15, which is the length of the cycle
✔ Adding future predictions when solving the knapsack problem slightly improves performance when using a deceptive function or when using small values for m
✔ Adding future predictions when solving Ošmera's function seems to help
✘ There is little difference in performance between using a perfect or a noisy predictor...
✔ There is little difference in performance between using a perfect or a noisy predictor...

  19. Conclusions & Future Research — Future Research
☞ How sensitive are the parameters m and ∆?
☞ Why does this work well for a real-valued problem but not for a problem from a discrete domain?
☞ Could we replace this whole complicated process by adding more disturbance, for instance with a high mutation rate?
