Automatic Configuration of Benchmark Sets for Classical Planning

  1. Automatic Configuration of Benchmark Sets for Classical Planning
     Álvaro Torralba (Aalborg University, Denmark)
     Jendrik Seipp, Silvan Sievers (University of Basel, Switzerland)
     October 21, 2020

  2. Outline
     1. The ICAPS Way
     2. Benchmark Design Principles
     3. Benchmark Configuration
     4. Evaluation
     5. Conclusion

  3. The Cycle of Life (in Planning Research)
     "Everything You Always Wanted to Know About Planning (But Were Afraid to Ask)" (Jörg Hoffmann, 2011)
     [Figure, revealed in three animation steps across slides 3-5.]

  6. Empirical Evaluation – Examples from HSDIP'20
     [Image-only slides 6-13: examples of empirical evaluations from HSDIP'20 papers, revealed step by step.]

  14. Empirical Evaluation – The ICAPS/IPC Way
      Measure coverage (number of solved instances)
      Time limit: 30 minutes
      Memory limit: 2-8 GB
      Use the benchmarks from the International Planning Competition (IPC)
      Having a standard evaluation setting is generally beneficial:
      it supports reproducibility and interpretability, and it avoids hand-picking results.
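To make this setup concrete, here is a minimal sketch of how coverage is computed under such limits; the run data, planner names, and exact limits below are illustrative assumptions, not results from the talk.

    from collections import defaultdict

    # Hypothetical run results:
    # (planner, domain, instance) -> (solved, runtime in s, peak memory in MiB).
    results = {
        ("lama", "nomystery", "p01"): (True, 12.4, 310),
        ("lama", "nomystery", "p02"): (False, 1800.0, 2048),
        ("decstar", "nomystery", "p01"): (True, 8.9, 120),
    }

    TIME_LIMIT = 30 * 60      # 30 minutes, in seconds
    MEMORY_LIMIT = 8 * 1024   # upper end of the 2-8 GB range, in MiB

    def coverage(results):
        """Count solved instances per planner and domain, respecting the limits."""
        cov = defaultdict(int)
        for (planner, domain, _), (solved, time, mem) in results.items():
            if solved and time <= TIME_LIMIT and mem <= MEMORY_LIMIT:
                cov[(planner, domain)] += 1
        return cov

    print(dict(coverage(results)))
    # {('lama', 'nomystery'): 1, ('decstar', 'nomystery'): 1}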

  17. Outline (next section: Benchmark Design Principles)

  18. The Diversity in the IPC Benchmark Set
      [Image-only slide.]

  19. So, What's Wrong with the IPC Benchmark Set?

                             IPC              New'14
                             L    D    O      L    D    O
      Nomystery (20)         11   20   12     25   30   24
      Rovers (40)            40   40   40     22   18   21
      Woodworking (50)       50   50   50     18   27   30
      Total                  101  110  102    65   75   75

      Table: Coverage of LAMA (L), Decstar (D) and OLCFF (O).

      Different number of instances per domain
      Instance scaling: too easy, too hard, and not smooth
      → Experiments on some domains of the IPC benchmark set may not observe
      any difference between planners even if one exists!

  22. Non-Smooth Scaling
      [Plot: per-instance runtime of Complementary 2 and Delfi-blind on the
      IPC instances; y-axis: Time (s), log scale from 10^1 to 10^3 plus an
      "uns." (unsolved) band.]

  23. Smooth Scaling
      [Plot: per-instance runtime of Complementary 2 and Delfi-blind on the
      New'14 instances; y-axis: Time (s), log scale from 10^-2 to 10^3 plus
      an "uns." (unsolved) band.]
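One simple way to quantify the contrast between these two plots (an illustration, not a metric from the talk) is to measure the gaps between consecutive per-instance runtimes on a log scale: smooth scaling shows small, similar gaps, while a non-smooth set jumps from trivially easy directly to unsolved. The runtimes below are made up.

    import math

    def scaling_gaps(runtimes):
        """Return the log10 gaps between consecutive sorted runtimes.

        Small, similar gaps indicate smooth scaling; one huge gap means the
        set jumps straight from easy instances to near-unsolvable ones.
        """
        times = sorted(runtimes)
        return [math.log10(b) - math.log10(a) for a, b in zip(times, times[1:])]

    ipc_like = [0.1, 0.2, 0.3, 1500.0, 1700.0]    # abrupt jump: non-smooth
    new_like = [0.05, 0.2, 0.9, 4.0, 18.0, 80.0]  # ~4-5x per step: smooth

    print(max(scaling_gaps(ipc_like)))  # ~3.7 orders of magnitude in one step
    print(max(scaling_gaps(new_like)))  # ~0.65 orders of magnitude per step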

  24. Contribution
      An automatic tool to select instances from a given domain that are more
      informative than the IPC set for comparing current and future planners.
      1. Smooth scaling from easy to hard instances:
         Easy: solvable by any planner that anyone would compare against (a baseline)
         Hard: out of reach of current planners within a reasonable time limit
      2. Minimize bias towards or against the planners used
      (Both criteria are illustrated in the sketch below.)
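A rough sketch of how these two criteria could interact (a simplification with hypothetical data, not the authors' algorithm): averaging difficulty over several baseline planners keeps any single planner from dominating the selection, and targets spaced geometrically between an easy bound and the time limit yield smooth scaling.

    EASY = 1.0      # target runtime of the easiest instance, in seconds
    HARD = 1800.0   # target runtime of the hardest instance (the time limit)

    def select_instances(candidates, k=10):
        """Pick k instances (k >= 2) whose difficulty scales geometrically.

        candidates: dict mapping instance name -> list of runtimes in
        seconds, one entry per baseline planner.
        """
        # Averaging over planners reduces bias towards any single one.
        difficulty = {inst: sum(ts) / len(ts) for inst, ts in candidates.items()}
        ratio = (HARD / EASY) ** (1.0 / (k - 1))  # constant growth factor
        targets = [EASY * ratio**i for i in range(k)]
        chosen = []
        for target in targets:
            # Greedily take the unchosen instance closest to the target difficulty.
            inst = min((i for i in difficulty if i not in chosen),
                       key=lambda i: abs(difficulty[i] - target))
            chosen.append(inst)
        return chosen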

  28. Outline (next section: Benchmark Configuration)
