An Empirical Study of Perfect Potential Heuristics


  1. An Empirical Study of Perfect Potential Heuristics
     Augusto B. Corrêa and Florian Pommerening
     University of Basel, Switzerland
     July 14, 2019

  2. Motivation
     Context:
     ◮ Optimal classical planning
     Goals:
     ◮ Learn more about the topology of different domains
     ◮ Study the characteristics of h*
     ◮ Understand the limitations of potential heuristics

  3. Potential Heuristics
     ◮ States are represented as sets of facts
     ◮ A feature f is a set of facts; its size is |f|
     ◮ A feature f is true in a state s if f ⊆ s
     Definition (Potential Heuristic)
     A weight function w associates a set of features F with weights.
     It induces the potential heuristic
         h^pot_w(s) = Σ_{f ∈ F} w(f) · [f ⊆ s].
     The dimension of h^pot_w is the size of its largest feature f.
     ◮ Higher dimension = more complex interactions between facts
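The definition above can be sketched in a few lines of Python. This is a minimal illustration with an assumed fact encoding (states and features as frozensets of fact names), not the implementation used in the paper; the facts `at-A` and `holding` are hypothetical.

```python
def h_pot(weights, state):
    """Sum the weights of all features that are true in the state,
    i.e. all features that are subsets of the state's fact set."""
    return sum(w for feature, w in weights.items() if feature <= state)

def dimension(weights):
    """Dimension of the heuristic: size of its largest feature."""
    return max((len(f) for f in weights), default=0)

# Two atomic (size-1) features and one size-2 interaction feature.
weights = {
    frozenset({"at-A"}): 2,
    frozenset({"holding"}): 1,
    frozenset({"at-A", "holding"}): -1,  # only counts when both facts hold
}
state = frozenset({"at-A", "holding"})
print(h_pot(weights, state))   # 2 + 1 - 1 = 2
print(dimension(weights))      # 2
```

The size-2 feature contributes only when both of its facts hold at once, which is exactly the kind of interaction a dimension-1 heuristic cannot express.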

  4. Potential Heuristics
     What if state s is unsolvable? Then h^pot_w(s) should be ∞.
         h_{w1,w2}(s) = ∞               if h^pot_{w2}(s) > 0
                        h^pot_{w1}(s)   otherwise.
     h_{w1,w2} is a perfect potential heuristic if
     ◮ h^pot_{w1}(s) is perfect for all solvable states s
     ◮ h^pot_{w2} captures all unsolvable states correctly
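The case distinction above translates directly into code. This is a hedged sketch: the two weight functions and the facts `goal-far` and `key-destroyed` are hypothetical, chosen so that the second weight function fires exactly on the unsolvable state.

```python
import math

def h_pot(weights, state):
    return sum(w for f, w in weights.items() if f <= state)

def h_combined(w1, w2, state):
    """Return infinity if the dead-end detector w2 fires,
    otherwise the finite estimate from w1."""
    if h_pot(w2, state) > 0:
        return math.inf
    return h_pot(w1, state)

# w2 assigns positive weight to a fact that only holds in unsolvable states.
w1 = {frozenset({"goal-far"}): 3}
w2 = {frozenset({"key-destroyed"}): 1}
print(h_combined(w1, w2, frozenset({"goal-far"})))                   # 3
print(h_combined(w1, w2, frozenset({"goal-far", "key-destroyed"})))  # inf
```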

  5. Optimal Correlation Complexity
     Definition (Optimal Correlation Complexity of a task)
     The optimal correlation complexity of a planning task Π is the minimum
     dimension of a perfect potential heuristic for Π.
     This gives us some insight into the complexity of the interactions
     between the facts of the task.

  6. Optimal Correlation Complexity
     We study the optimal correlation complexity of IPC domains empirically.
     Computing optimal correlation complexity is hard. We need...
     ◮ ... h* for the entire (reachable) state space
     ◮ ... to find a good set of features
     ◮ ... to efficiently find a weight function

  7. Computing a Perfect Potential Heuristic
     Exact methods for finite and infinite values:
     ◮ Linear programs over the entire state space
     ◮ Initial set of candidate features F; augment it as needed
     ◮ The potential heuristic found has optimal dimension
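The weight-finding step can be illustrated with a least-squares stand-in (the paper uses linear programs; `numpy` and the toy task here are assumptions, not the authors' setup). Given h* on all states and a candidate feature set, weights making the heuristic perfect exist exactly when the linear system A·w = h* has an exact solution, where A[i, j] = [feature j is true in state i]; otherwise the feature set must be augmented.

```python
import numpy as np

def fit_weights(states, h_star, features):
    """Try to make the potential heuristic perfect on the given states by
    solving A w = h*. Returns the weight vector, or None if no exact
    solution exists (i.e. the feature pool must be augmented)."""
    A = np.array([[1.0 if f <= s else 0.0 for f in features] for s in states])
    w, *_ = np.linalg.lstsq(A, np.array(h_star, dtype=float), rcond=None)
    if not np.allclose(A @ w, h_star):
        return None  # best least-squares fit is not exact
    return w

# Toy state space over facts {a, b}; here h* counts the facts present,
# so two size-1 features already suffice.
states = [frozenset(), frozenset({"a"}), frozenset({"b"}), frozenset({"a", "b"})]
features = [frozenset({"a"}), frozenset({"b"})]
w = fit_weights(states, [0, 1, 1, 2], features)
print(w is not None)   # an exact dimension-1 solution exists

# A different h* that no combination of these features can match exactly:
print(fit_weights(states, [1, 1, 1, 0], features) is None)  # augment features
```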

  8. Experiments
     Using Fast Downward and IPC domains
     ◮ 30 minutes and 3.5 GB per task
     ◮ 301 tasks over 38 domains where we can compute the perfect heuristic
       for the entire state space
     ◮ The sample size is rather small

  9. Results
     [Figure: histogram of optimal correlation complexities; x-axis:
     optimal correlation complexity (2–8), y-axis: number of instances]

 10. Results
     Lower bounds for the optimal correlation complexity per domain

     Domain                     Lower Bound
     gripper                    7
     hiking-opt14               6
     miconic                    7
     movie                      2
     nomystery-opt11            5
     organic-synthesis-opt18    6
     psr-small                  8
     rovers                     8
     scanalyzer-08              5
     scanalyzer-opt11           5
     storage                    5
     tpp                        5
     transport-opt08            6
     visitall-opt11             8
     zenotravel                 4

 11. Results
     Lower bounds for the optimal correlation complexity per domain

     Domain                     Lower Bound
     gripper                    7 → 5
     hiking-opt14               6
     miconic                    7 → 6
     movie                      2
     nomystery-opt11            5 → 4
     organic-synthesis-opt18    6 → 1
     psr-small                  8 → 4
     rovers                     8 → 5
     scanalyzer-08              5
     scanalyzer-opt11           5
     storage                    5 → 4
     tpp                        5 → 4
     transport-opt08            6 → 4
     visitall-opt11             8 → 7
     zenotravel                 4

     Considering only reachable states yields significantly lower complexity

 12. Results
     (Same lower bounds per domain as on the previous slide.)
     The same holds for detecting unsolvable states:
     ◮ The maximum dimension needed to detect unsolvable states was 3

 13. Computing a (Quasi-)Perfect Potential Heuristic
     How close can we get with features of limited size?
     Minimal Error for Finite Values:
     ◮ Start with an "empty" potential heuristic
     ◮ Iteratively select the feature that minimizes the error of the heuristic
     ◮ Once no feature up to size n reduces the error, add features of
       size n + 1 to the feature pool
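The loop above can be sketched as follows. This is a simplified stand-in, not the authors' method: the per-feature weight is fitted by a least-squares step rather than the paper's LP, and the error is the squared deviation from h* over all states; the toy task is hypothetical.

```python
import itertools

def greedy_minimal_error(states, h_star, facts, max_dim=3):
    """Greedily add features that reduce the error of the current heuristic;
    when no feature of the current size helps, grow the feature size."""
    h = [0.0] * len(states)   # current heuristic values, "empty" heuristic
    chosen = []               # (feature, weight) pairs, in selection order
    dim = 1
    while dim <= max_dim:
        candidates = [frozenset(c) for c in itertools.combinations(facts, dim)]
        while True:
            best = None
            for f in candidates:
                idx = [i for i, s in enumerate(states) if f <= s]
                if not idx:
                    continue
                # Best single weight for f (least squares): mean residual;
                # the resulting squared-error reduction is |idx| * w^2.
                w = sum(h_star[i] - h[i] for i in idx) / len(idx)
                gain = len(idx) * w * w
                if gain > 1e-9 and (best is None or gain > best[2]):
                    best = (f, w, gain)
            if best is None:
                break  # no size-dim feature reduces the error any further
            f, w, _ = best
            chosen.append((f, w))  # a feature may recur; weights accumulate
            for i, s in enumerate(states):
                if f <= s:
                    h[i] += w
        dim += 1  # enlarge the feature pool
    error = sum(abs(a - b) for a, b in zip(h_star, h))
    return chosen, error

# Toy task: h* counts the facts present, expressible with size-1 features.
states = [frozenset(), frozenset({"a"}), frozenset({"b"}), frozenset({"a", "b"})]
chosen, err = greedy_minimal_error(states, [0, 1, 1, 2], ["a", "b"], max_dim=2)
print(len(chosen) > 0, err < 1e-3)
```

Because the two size-1 features overlap in the state {a, b}, the greedy loop alternates between them and drives the error toward zero rather than hitting it in one step, mirroring the slow error decay seen in the experiments.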

 14. Results
     [Figure: remaining error per feature added, one line per instance;
     log–log plot of error vs. number of features selected]

 15. Results
     (Same plot as on the previous slide.)
     Only a few features of a given size are very important

 16. Conclusion
     Recap
     ◮ We investigated the "shape" of h* in several domains
     ◮ Bad news: even easy domains need perfect potential heuristics with
       high dimension
     ◮ Good news: only a small number of large features already reduce the
       heuristic error significantly
     Open Question
     ◮ How to automatically identify an informative subset of
       high-dimensional features?
     ◮ We could then find good weights using an FPT algorithm

