
SLIDE 1

Outline: Introduction, Compact Characterizations, Larger Features

Higher-Dimensional Potential Heuristics for Optimal Classical Planning

Florian Pommerening1 Malte Helmert1 Blai Bonet2

1University of Basel, Switzerland  2Universidad Simón Bolívar, Venezuela

February 7, 2017

SLIDE 2

Higher-Dimensional Potential Heuristics for Optimal Classical Planning

SLIDE 3

Classical planning: find the cheapest action sequence to achieve a goal. States are variable assignments; operators change variable values.

SLIDE 4

h(s) = Σ_{f ∈ F} w(f) · [s ⊨ f]

A potential heuristic is a weighted sum of state features. Two choices:

Which features to use? How to find good weights?

SLIDE 5

Features are conjunctions of facts. The size of a feature is its number of conjuncts.

“Atomic” features (size 1): w(at-A) = 10, w(at-B) = 5
“Binary” features (size 2): w(at-B ∧ door-locked) = 10
. . .

SLIDE 6

Why do we care about higher-dimensional features?

Initial heuristic values

[Figure: scatter plots of initial heuristic values (axes 0–40), comparing atomic vs. binary features and binary features vs. the perfect heuristic]

Atomic features are often not sufficient for high-quality heuristics.

SLIDE 7

Goals

Find good weights automatically. Ideally:

Declare properties of heuristics (admissible, consistent)
Constraints characterize heuristics with these properties
Select the best possible heuristic from the space of solutions

SLIDE 8

Our Contributions

Describing admissible and consistent potential heuristics:

Features                   Characterization
All atomic features        compact [previous work]
All binary features        compact [new]
All ternary features       intractable [new]
Subset of all features     fixed-parameter tractable [new]

Also in the paper: potential functions ≃ transition cost partitioning

SLIDE 9

Compact Characterizations

SLIDE 10

Compact Characterization

Characterizing admissible and consistent heuristics:

Goal awareness: h(s∗) ≤ 0. Easy: h(s∗) is a sum of weights.
Consistency: h(s) − h(s′) ≤ cost(o) for every transition s → s′ via o. Hard: exponential number of constraints.
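Why the consistency side is hard shows up in a naive checker: it needs one check per transition, and the number of states is exponential in the number of variables. A toy sketch; the task encoding and names are assumptions for illustration:

```python
from itertools import product

def all_states(domains):
    """Enumerate every variable assignment -- exponentially many."""
    variables = sorted(domains)
    for values in product(*(domains[v] for v in variables)):
        yield dict(zip(variables, values))

def is_consistent(h, operators, domains):
    """Check h(s) - h(s') <= cost(o) for every transition s -> s' via o."""
    for state in all_states(domains):
        for pre, eff, cost in operators:   # operator = (precondition, effect, cost)
            if all(state[v] == val for v, val in pre.items()):
                successor = {**state, **eff}
                if h(state) - h(successor) > cost:
                    return False
    return True

# One variable, one move operator of cost 3.
domains = {"at": ["A", "B"]}
operators = [({"at": "A"}, {"at": "B"}, 3)]
h_ok = lambda s: {"A": 3, "B": 0}[s["at"]]    # drops by exactly cost(o)
h_bad = lambda s: {"A": 5, "B": 0}[s["at"]]   # drops by 5 > cost(o)
print(is_consistent(h_ok, operators, domains))   # True
print(is_consistent(h_bad, operators, domains))  # False
```

The compact characterizations in this talk replace this exponential enumeration with a polynomial-size set of linear constraints.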

SLIDE 11

Consistency

Consider a single operator o. Three types of features:

irrelevant: no variables in common with o
context-independent: all variables in common with o
context-dependent: some in common with o, some not

Heuristic difference caused by operator o:

h(s) − h(s′) = ∆irr(s) + ∆ind(s) + ∆dep(s)
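The three feature types can be told apart purely by comparing variable sets. A small sketch (the encoding and names are assumptions):

```python
# Classify a feature relative to an operator o by comparing the feature's
# variables with the variables o mentions.

def classify(feature_vars, operator_vars):
    shared = feature_vars & operator_vars
    if not shared:
        return "irrelevant"            # truth value cannot change under o
    if feature_vars <= operator_vars:
        return "context-independent"   # change fully determined by o
    return "context-dependent"         # change depends on the rest of the state

op_vars = {"at", "door"}
print(classify({"fuel"}, op_vars))          # irrelevant
print(classify({"at", "door"}, op_vars))    # context-independent
print(classify({"at", "fuel"}, op_vars))    # context-dependent
```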
SLIDE 12

Heuristic Difference when Applying Operator o

Consistency for an operator o:

∆irr(s) + ∆ind(s) + ∆dep(s) ≤ cost(o)  for every transition s → s′ via o, where ∆irr(s) = 0

Irrelevant features have no variables in common with o, so applying o does not change their truth value: they do not contribute to the heuristic difference.

SLIDE 15

Heuristic Difference when Applying Operator o

Consistency for an operator o:

∆irr(s) + ∆ind(s) + ∆dep(s) ≤ cost(o)  for every transition s → s′ via o, where ∆irr(s) = 0

Context-independent features have all variables in common with o, so their change in truth value is fully determined by o: the heuristic change is easy to specify and does not depend on the state.

SLIDE 18

Heuristic Difference when Applying Operator o

Consistency for an operator o:

∆irr(s) + ∆ind(s) + ∆dep(s) ≤ cost(o)  for every transition s → s′ via o, where ∆irr(s) = 0

Context-dependent features have at least one variable in common with o and at least one variable not mentioned by o: the heuristic change depends on the state.

SLIDE 19

Context-Dependent Features

Atomic features: no context-dependent features.
Binary features: context limited to one variable.

A “worst value” exists for each variable. In the worst case all variables take their worst value, and the constraint for this worst state implies all other constraints.

Theorem: Admissible and consistent potential heuristics over binary features can be characterized by a compact set of linear constraints.
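The worst-value argument can be made concrete with a small sketch. With binary features each operator-affected variable has a single context variable, so the state-dependent part of the heuristic change decomposes per variable, and taking each variable's worst value yields the one binding constraint. The encoding and names below are assumptions for illustration:

```python
# For each context variable, delta_by_value gives the heuristic change it
# contributes, for each of its values, when the operator is applied. With
# binary features these contributions are independent, so the worst state
# takes the worst value of every variable, and a single constraint
#     worst_state_delta <= cost(o)
# implies the consistency constraints for all other states.

def worst_state_delta(context_deltas):
    return sum(max(by_value.values()) for by_value in context_deltas.values())

context_deltas = {
    "door": {"locked": 4, "open": 1},   # worst value: locked (4)
    "fuel": {"full": 0, "empty": 2},    # worst value: empty (2)
}
print(worst_state_delta(context_deltas))  # 4 + 2 = 6
```

One constraint per operator built from these worst values replaces the exponentially many per-state constraints, which is what makes the characterization compact.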

SLIDE 20

Larger Features

SLIDE 21

Intractability

In general, the change in potential when applying o depends on more than one variable: the influence of a variable V on o depends on a larger context.

Theorem: Testing whether a given potential function is consistent is coNP-complete. This already holds with only ternary features.

Proof: reduction from non-3-colorability.

SLIDE 22

Fixed-Parameter Tractability

The approach for binary features can be generalized: factor out the influence of one variable at a time. This generalizes the bucket elimination algorithm from numerical cost functions to linear expressions.

Theorem: Computing a set of linear constraints that characterizes the admissible and consistent potential heuristics for a set of features is fixed-parameter tractable. Parameter: the tree-width of the feature connectivity graph.
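The paper generalizes bucket elimination from numbers to linear expressions; the plain numeric version it builds on looks like the sketch below (a table-based encoding assumed for illustration):

```python
from itertools import product

def bucket_eliminate(functions, order, domains):
    """Maximize sum(f) over all assignments by eliminating one variable at
    a time (plain numeric bucket elimination). Each function is a pair
    (scope, table): scope is a tuple of variables, table maps value tuples
    to numbers. Intermediate scopes stay small when the functions'
    connectivity graph has low tree-width."""
    funcs = list(functions)
    for var in order:
        bucket = [f for f in funcs if var in f[0]]
        funcs = [f for f in funcs if var not in f[0]]
        # joint scope of the bucket, minus the eliminated variable
        scope = tuple(sorted({v for s, _ in bucket for v in s} - {var}))
        table = {}
        for values in product(*(domains[v] for v in scope)):
            assign = dict(zip(scope, values))
            table[values] = max(
                sum(t[tuple({**assign, var: x}[v] for v in s)] for s, t in bucket)
                for x in domains[var])
        funcs.append((scope, table))
    return sum(t[()] for _, t in funcs)   # all scopes are now empty

# Two overlapping functions over binary variables: f1(a,b) = a + b and
# f2(b,c) = 2b + c; the maximum of their sum is 5, at a = b = c = 1.
dom = {"a": [0, 1], "b": [0, 1], "c": [0, 1]}
f1 = (("a", "b"), {(a, b): a + b for a in dom["a"] for b in dom["b"]})
f2 = (("b", "c"), {(b, c): 2 * b + c for b in dom["b"] for c in dom["c"]})
print(bucket_eliminate([f1, f2], order=["a", "c", "b"], domains=dom))  # 5
```

The generalization in the paper stores linear expressions over the potential weights in the tables instead of numbers, so eliminating all variables yields constraints rather than a single value.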

SLIDE 23

Take Home Messages

SLIDE 24

Take Home Messages

Characterization of admissible and consistent potential functions:

Compact for binary features
coNP-complete for ternary or larger features
. . . but fixed-parameter tractable (parameter: tree-width of feature connectivity)