Directed Model Checking (not only) for Timed Automata (PowerPoint presentation)



SLIDE 1

Directed Model Checking (not only) for Timed Automata

Sebastian Kupferschmid March, 2010

SLIDE 2

Model Checking

Motivation

Embedded Systems

Omnipresent, safety-relevant systems

Pentium bug, Ariane 5

Errors can be extremely harmful ⇒ correct functioning is absolutely mandatory

SLIDE 3

Model Checking

Correct Systems

Every system state satisfies invariant

M, s0 ⊨ ∀□ϕ (requires exploring the full state space)

Erroneous Systems

Find error states fast; short error traces

M, s0 ⊨ ∃♦¬ϕ

Directed Model Checking

Combination of Artificial Intelligence and Model Checking: accelerate the search towards error states with heuristic functions

SLIDE 4

Outline

Introduction

Timed Automata
Directed Model Checking

Coming up with Heuristics in a Principled Way

Pattern Database Heuristics
Pattern selection strategies

Summary

Empirical evaluation of several heuristics
Literature

SLIDE 5

Timed Automata

Syntax

Definition (Timed Automaton)

A timed automaton A is a tuple ⟨L, l0, E, X, V, Σ, I⟩, where L is a finite set of locations, l0 ∈ L is the initial location, E is a finite set of edges, X is a finite set of clocks, V is a finite set of integer variables, Σ is a set of synchronization symbols, and I assigns invariants to locations.

[Figure: example timed automaton with locations s0, s1, s2; invariants x ≤ 1 on s1 and s2; edges labeled with synchronization c?, reset x := 0, and guards x < 1 and x ≥ 1]

SLIDE 6

Timed Automata

Semantics

Semantics

States assign values to

Automata, Integer variables, and Clocks

Transitions

Discrete and delay

⇒ infinite transition system

A possible Behavior

[Figure: the example automaton together with one possible behavior: the value of clock x plotted over time (0 to 3) while the automaton visits s0, s1, s2, s0]

SLIDE 7

The Zone Graph

Symbolic State Space

The Zone Graph

Finite and exact abstraction of the timed automata semantics. A symbolic state corresponds to a set of states that have the same discrete part and whose clock values satisfy a conjunction of clock constraints, a so-called zone.

[Figure: zone graph of the example automaton with symbolic states such as ⟨s0, x = 0⟩, ⟨s0, x ≥ 0⟩, ⟨s0, x < 1⟩, ⟨s1, x = 0⟩, ⟨s1, x ≤ 1⟩, ⟨s2, x < 1⟩, ⟨s2, x ≤ 1⟩, ⟨s0, x ≥ 1⟩, ⟨s0, x = 1⟩, ⟨s0, x ≤ 1⟩]

SLIDE 8

Model Checking Task

Definition (Model Checking Task)

A model checking task T is a tuple ⟨M, ϕ⟩, where M = A1 ∥ . . . ∥ An is a system of timed automata and ϕ is an error formula.

SLIDE 9

Directed Model Checking

Objective in DMC

Given: a model checking task T = ⟨M, ϕ⟩ with corresponding symbolic state space S(M) = ⟨S, s0, T⟩

Find: a sequence π = s0 −t1→ s1 −t2→ . . . −tn→ sn, where si ∈ S, each si −ti+1→ si+1 ∈ T, and sn ⊨ ϕ

Approach: informed search algorithm + heuristic function

SLIDE 10

Directed Model Checking

Model Checking + Heuristic Search

Definition (heuristic function)

Let T = ⟨M, ϕ⟩ be a model checking task and let S(M) = ⟨S, s0, T⟩ be the state space of M. A heuristic function (or heuristic) is a function h : S → N0 ∪ {∞}. The heuristic estimate h(s) for a state s ∈ S is supposed to estimate the distance from s to the nearest error state.

SLIDE 11

Heuristic Search

The General Idea

[Figure: search space between the initial state and the error state; arrows annotated with distance estimates]

SLIDE 12

The Properties of Heuristics

Definition (perfect heuristic)

Let T = ⟨M, ϕ⟩ and let S(M) = ⟨S, s0, T⟩. The perfect heuristic of S(M) is the heuristic h∗ which maps each state s ∈ S to the length of a shortest path from s to any error state. Note: h∗(s) = ∞ iff no error state is reachable from s.

A heuristic h is called

admissible if h(s) ≤ h∗(s) for all states s ∈ S
safe if h∗(s) = ∞ for all s ∈ S with h(s) = ∞
goal-aware if h(s) = 0 for all error states s ∈ S
consistent if h(s) ≤ h(s′) + 1 for all states s, s′ ∈ S s.t. s → s′ ∈ T
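These four properties can be checked mechanically on a small explicit transition system. The following is a minimal Python sketch (not from the lecture); `states`, `trans`, and `errors` are assumed toy inputs, with h∗ computed by backward BFS:

```python
from collections import deque

def hstar(states, trans, errors):
    # Exact error distances h* via backward BFS; states from which
    # no error state is reachable get distance infinity.
    INF = float("inf")
    pred = {s: [] for s in states}
    for u, v in trans:
        pred[v].append(u)
    dist = {s: INF for s in states}
    queue = deque(errors)
    for e in errors:
        dist[e] = 0
    while queue:
        v = queue.popleft()
        for u in pred[v]:
            if dist[u] == INF:
                dist[u] = dist[v] + 1
                queue.append(u)
    return dist

def check_properties(h, states, trans, errors):
    # Test admissibility, safety, goal-awareness, and consistency
    # exactly as defined on this slide.
    hs = hstar(states, trans, errors)
    INF = float("inf")
    return {
        "admissible": all(h(s) <= hs[s] for s in states),
        "safe": all(hs[s] == INF for s in states if h(s) == INF),
        "goal_aware": all(h(s) == 0 for s in errors),
        "consistent": all(h(u) <= h(v) + 1 for u, v in trans),
    }
```

For instance, on the chain 0 → 1 → 2 with error state 2, the perfect heuristic satisfies all four properties, while a constant heuristic of 3 is neither admissible nor goal-aware.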
SLIDE 13

A Generic Informed Search Algorithm

function dmc(M, ϕ, h):
    open = empty priority queue
    closed = ∅
    open.insert(s0, priority(s0, h))
    while open ≠ ∅ do:
        s = open.getMinimum()
        if s ⊨ ϕ then:
            return True
        if s ∉ closed then:
            closed = closed ∪ {s}
            for each s′ ∈ succs(s) do:
                open.insert(s′, priority(s′, h))
    return False
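The generic algorithm above can be sketched as runnable Python; `succs`, `is_error`, `h`, and `priority` are assumed callback interfaces for illustration, not part of any real model checker's API:

```python
import heapq
import itertools

def dmc(s0, succs, is_error, h, priority):
    # Generic directed model checking search. The priority function
    # decides the search strategy (A*, greedy, ...).
    counter = itertools.count()  # tie-breaker so heapq never compares states
    open_queue = [(priority(s0, 0, h), next(counter), s0, 0)]
    closed = set()
    while open_queue:
        _, _, s, depth = heapq.heappop(open_queue)
        if is_error(s):
            return True  # error state found
        if s not in closed:
            closed.add(s)
            for t in succs(s):
                heapq.heappush(
                    open_queue,
                    (priority(t, depth + 1, h), next(counter), t, depth + 1))
    return False  # reachable state space exhausted, no error state
```

With `priority(s, d, h) = d + h(s)` this behaves like A∗; with `priority(s, d, h) = h(s)` it is greedy search (both discussed on the following slides).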

SLIDE 14

Heuristic Search Methods

A∗ Search

priority(s, h) = depth(s) + h(s). If h is admissible ⇒ shortest possible error traces. Often high memory consumption.

Greedy Search

priority(s, h) = h(s). Expands fewer states than A∗ in practice. No guarantee on error trace length.

SLIDE 15

Dominance

Definition (Dominance)

Let h, h′ be two admissible heuristics. The heuristic h dominates h′ iff h(s) ≥ h′(s) for all s ∈ S.

Theorem

Let h, h′ be two admissible heuristics. If h dominates h′, then every state explored by A∗ with h is also explored by A∗ with h′.

SLIDE 16

Heuristics for Directed Model Checking

Requirements for h

1. Accurate (with respect to h∗)

“The closer the better”; it has to work well in practice

2. Efficiently computable for any state s

The heuristic has to be computed for every encountered state; efficient = low-order polynomial in T

3. Derived automatically for a given model checking task

Based on the declarative description of T; no user interaction

SLIDE 17

A Simple Heuristic for Directed Model Checking

Hamming Distance Heuristic

The minimal number of variable values that have to be changed in order to turn s into an error state e:

h(s) = min { #different_values(s, e) | e ∈ S, e ⊨ ϕ }

Intuition

The more similar to an error state, the closer to an error state.
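As a one-line sketch, with states encoded as tuples of variable values (an encoding assumed here for illustration):

```python
def hamming_heuristic(s, error_states):
    # Minimal number of positions in which s differs from any error state.
    return min(sum(a != b for a, b in zip(s, e)) for e in error_states)
```

For example, `hamming_heuristic((0, 0, 0), [(0, 1, 1), (1, 0, 0)])` measures the distance to the nearer of the two error states.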

SLIDE 18

Criticism of the Hamming Distance Heuristic

What is wrong with the Hamming distance heuristic?

Quite uninformative: the range of heuristic values is small; typically, most successors have the same estimate
Sensitive to reformulation: one can easily transform any MC task into an equivalent one where h(s) = 1 for all non-error states (how?)
Ignores almost all problem structure: heuristic values do not depend on the set of transitions!

⇒ need a better, principled way of coming up with heuristics

SLIDE 19

Coming up with Heuristics in a Principled Way

In this Lecture: Pattern Database Heuristics

State-of-the-art heuristics
Based on abstractions
Fully automatically generated, no user interaction
Applicable to a wide range of transition systems

SLIDE 20

A Design Principle for Heuristics

The General Idea

Given

A model checking task T = ⟨M, ϕ⟩ with corresponding state space S(M) = ⟨S, s0, T⟩

A Generic Approach for Obtaining Heuristics

Select an overapproximation T α = ⟨Mα, ϕα⟩ of T with S(Mα) = ⟨Sα, sα0, T α⟩

For every state s ∈ S encountered during the search:

Find a (shortest) error trace π from sα in S(Mα)
Set h(s) = |π|

SLIDE 21

A Design Principle for Heuristics

The General Idea

Original Transition System Overapproximation

SLIDE 22

A Design Principle for Heuristics

The General Idea

Original Transition System Overapproximation

[Figure: concrete state s mapped to abstract state sα; the shortest abstract error trace from sα has length 2, hence h(s) = 2]

SLIDE 23

Pattern Database (PDB) Heuristics

Prior to Search

Choose an abstraction α

For every abstract state sα ∈ S(Mα) = ⟨Sα, sα0, T α⟩:

Compute the abstract error distance distα(sα)
Store ⟨sα, distα(sα)⟩ in a lookup table (the pattern database)

During Search

Map state s to the corresponding abstract state sα
Heuristic value: h(s) = distα(sα)
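Both phases can be sketched over an explicit abstract transition system. This is a minimal illustration under assumed toy inputs (`abstract_states`, `abstract_trans`, `abstract_errors`, `abstract_of`), not a real zone-graph API:

```python
from collections import deque

def build_pdb(abstract_states, abstract_trans, abstract_errors):
    # Prior to search: abstract error distance dist_alpha for every
    # abstract state, via backward BFS from the abstract error states.
    INF = float("inf")
    pred = {s: [] for s in abstract_states}
    for u, v in abstract_trans:
        pred[v].append(u)
    pdb = {s: INF for s in abstract_states}
    queue = deque(abstract_errors)
    for e in abstract_errors:
        pdb[e] = 0
    while queue:
        v = queue.popleft()
        for u in pred[v]:
            if pdb[u] == INF:
                pdb[u] = pdb[v] + 1
                queue.append(u)
    return pdb

def pdb_heuristic(pdb, abstract_of):
    # During search: map the concrete state to its abstract image
    # and look the distance up in the pattern database.
    return lambda s: pdb[abstract_of(s)]
```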

SLIDE 24

How to Choose the Abstraction?

The Original State Space

SLIDE 25

How to Choose the Abstraction?

The Trivial Abstraction

SLIDE 26

How to Choose the Abstraction?

The Identity Abstraction

SLIDE 27

How to Choose the Abstraction?

The Perfect Abstraction

SLIDE 28

Conflicting Requirements

Requirements for the Heuristic

Informativeness (quality): it has to work well in practice

Requirements for the Abstraction

Efficient to compute
Not too many abstract states
Succinct representation (memory requirement)

Question: where is the sweet spot?

SLIDE 29

Two Different Abstraction Classes

Predicate Abstraction

Abstract state space defined by a set of selected predicates
Use SAT or SMT to construct the abstract state space
Fine-grained

Variable Abstraction

Special case of predicate abstraction
Ignores a subset of the system’s variables
Abstract model is in the same formalism (can be constructed with the same tool, often more efficient than general-purpose SAT solvers)

SLIDE 30

Pattern Selection

What kind of pattern shall we use?

Definition (Pattern)

A pattern is a set of variables/predicates used to define a system.

In this Lecture

Cone-of-influence-based pattern selection
Pattern selection using counterexamples
Syntax-based pattern selection
A local search approach

SLIDE 31

Pattern Selection for Variable Abstractions

Pattern P

Subset of the variables that are used to define the system, e.g., clocks, automata, synchronization labels, . . .

Abstraction of M with respect to P = {P, y, c, g}

[Figure: M = P ∥ Q and its abstraction Mα. Automaton P: locations left, walk, right; edge labels c!, g?, reset x := 0, guard x > 2. Automaton Q: locations red, yellow (invariant y ≤ 1), green; edge labels c?, reset y := 0, g!, guard y ≥ 2]

SLIDE 32

Patterns and Overapproximations

But: P = {P, y, c, g} does not induce an overapproximation! Why . . .

[Figure: the automata P, Q and their abstractions Pα, Qα]

SLIDE 33

Patterns and Overapproximations

But: P = {P, y, c, g} does not induce an overapproximation! Why . . .

[Figure: the automata P, Q and their abstractions Pα, Qα]

. . . because the abstract location walk of Pα is not reachable (synchronization)

SLIDE 34

Closure of Patterns

Definition (closed pattern)

A pattern P is closed iff {b | ∃a ∈ P : a depends on b} ⊆ P

Consequences

Closed pattern ⇒ overapproximation
Overapproximation: all error paths are preserved
Overapproximation ⇒ admissible heuristics
Note: every abstraction set can be closed
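Closing a pattern is a simple fixpoint computation. A sketch, where `depends_on` (mapping each variable to the variables it depends on) is an assumed input:

```python
def close_pattern(pattern, depends_on):
    # Add every variable some pattern member depends on, until fixpoint.
    closed = set(pattern)
    frontier = list(pattern)
    while frontier:
        a = frontier.pop()
        for b in depends_on.get(a, ()):
            if b not in closed:
                closed.add(b)
                frontier.append(b)
    return closed
```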

SLIDE 35

Cone-of-influence-based Pattern Selection

A COI-based Method

Given: a model checking task T = ⟨M, ϕ⟩ and a bound b ∈ N0.

Return: P = P0 ∪ P1 ∪ . . . ∪ Pb

Where: P0 = vars(ϕ) and Pi+1 = {v | ∃v′ ∈ Pi : v can influence v′}
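This layered fixpoint is easy to sketch; `influences` (mapping each variable to the variables it can influence) is an assumed input for illustration:

```python
def coi_pattern(error_vars, influences, b):
    # Invert the relation: which variables can influence a given one?
    influenced_by = {}
    for v, targets in influences.items():
        for t in targets:
            influenced_by.setdefault(t, set()).add(v)
    # P_0 = vars(phi); P_{i+1} = variables influencing P_i; return the union.
    layer = set(error_vars)
    pattern = set(layer)
    for _ in range(b):
        layer = {v for t in layer for v in influenced_by.get(t, ())}
        pattern |= layer
    return pattern
```

On the running example (c, g, x influencing P; Q influencing c and g), b = 1 yields {P, c, g, x} as shown on the following slides.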

SLIDE 36

Illustrating Example

Cone of Influence

Let T = ⟨M, ϕ⟩ with ϕ = (P = right) be a model checking task and let b = 1, M = P ∥ Q

[Figure: the automata P and Q from the previous example]

Cone of Influence

[Figure: influence graph over the variables P, c, g, Q, y, x]

SLIDE 37

Illustrating Example

Cone of Influence

Let T = ⟨M, ϕ⟩ with ϕ = (P = right) be a model checking task and let b = 1, M = P ∥ Q

[Figure: the automata P and Q from the previous example]

Cone of Influence

[Figure: influence graph over the variables P, c, g, Q, y, x]

With b = 1 we obtain P = {P, c, g, x}.

SLIDE 38

COI-based Pattern Selection

Concluding Comments

User interaction needed (bound b)
Small values of b ⇒ uninformed heuristic
Larger values of b ⇒ quickly converges towards the original system
Can be difficult to select good values for b

SLIDE 39

Counterexample-based Pattern Selection

The Method

Use the monotonicity abstraction
Compute an abstract error trace for the abstract MC problem
Relevant variables: all variables that occur in the declarative description of the transitions involved in the abstract error trace

Pattern P

Contains all relevant variables No user interaction

SLIDE 40

The Monotonicity Abstraction

Adaptation of “Ignoring negative Effects”

Idea

Abstract variables are set-valued
A variable, once it has obtained a value, keeps that value forever

Variables in the Abstraction

Concrete: v ∈ dom(v); abstract: v+ ⊆ dom(v)
Assignment v := w becomes v+ := v+ ∪ w+

Clocks in the Abstraction

Clocks trivialize very fast ⇒ ignored in the abstraction
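The abstraction can be sketched as a saturation loop over set-valued variables. The `(name, guard, effect)` transition encoding below is an assumption made for illustration; step 2 of the method (removing unnecessary transitions) is omitted:

```python
def monotone_reach(init, transitions, error_test, max_rounds=100):
    # Set-valued state: every variable accumulates all values it ever had.
    vals = {v: set(vs) for v, vs in init.items()}
    trace = []
    for _ in range(max_rounds):
        if error_test(vals):
            return trace  # abstract error trace (fired transition names)
        changed = False
        for name, guard, effect in transitions:
            if guard(vals):
                fired = False
                for v, new in effect(vals).items():
                    if not new <= vals[v]:  # values only grow, never shrink
                        vals[v] |= new
                        fired = True
                if fired:
                    changed = True
                    trace.append(name)
        if not changed:
            return None  # fixpoint without error: unreachable in abstraction
    return None
```

On the example from the next slides (locations l1, l2, l3; v := v + 1 guarded by v ≤ 0; w := 3; error ϕ = (v = 2)), this reaches v+ = {0, 1, 2} after two firings of l1 → l2.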

SLIDE 41

The Monotonicity Abstraction

Computing Abstract Error Traces

[Figure: automaton P with locations l1, l2, l3; edge l1 → l2 with guard v ≤ 0 and effect v := v + 1; edge l1 → l3 with effect w := 3]

Initial state: P = l1, v = 0, w = 0 Error formula: ϕ = (v = 2)

Computation of Abstract Error Traces

1. Simultaneous execution of all enabled transitions

P+ = {l1}, v+ = {0}, w+ = {0}

SLIDE 42

The Monotonicity Abstraction

Computing Abstract Error Traces

[Figure: automaton P with locations l1, l2, l3; edge l1 → l2 with guard v ≤ 0 and effect v := v + 1; edge l1 → l3 with effect w := 3]

Initial state: P = l1, v = 0, w = 0 Error formula: ϕ = (v = 2)

Computation of Abstract Error Traces

1. Simultaneous execution of all enabled transitions

P+ = {l1}, v+ = {0}, w+ = {0}
P+ = {l1, l2, l3}, v+ = {0, 1}, w+ = {0, 3}   (fired: l1 → l2, l1 → l3)

SLIDE 43

The Monotonicity Abstraction

Computing Abstract Error Traces

[Figure: automaton P with locations l1, l2, l3; edge l1 → l2 with guard v ≤ 0 and effect v := v + 1; edge l1 → l3 with effect w := 3]

Initial state: P = l1, v = 0, w = 0 Error formula: ϕ = (v = 2)

Computation of Abstract Error Traces

1. Simultaneous execution of all enabled transitions

P+ = {l1}, v+ = {0}, w+ = {0}
P+ = {l1, l2, l3}, v+ = {0, 1}, w+ = {0, 3}   (fired: l1 → l2, l1 → l3)
P+ = {l1, l2, l3}, v+ = {0, 1, 2}, w+ = {0, 3}   (fired: l1 → l2)

SLIDE 44

The Monotonicity Abstraction

Computing Abstract Error Traces

[Figure: automaton P with locations l1, l2, l3; edge l1 → l2 with guard v ≤ 0 and effect v := v + 1; edge l1 → l3 with effect w := 3]

Initial state: P = l1, v = 0, w = 0 Error formula: ϕ = (v = 2)

Computation of Abstract Error Traces

1. Simultaneous execution of all enabled transitions
2. Remove unnecessary transitions

P+ = {l1}, v+ = {0}, w+ = {0}
P+ = {l1, l2, l3}, v+ = {0, 1}, w+ = {0, 3}   (fired: l1 → l2, l1 → l3)
P+ = {l1, l2, l3}, v+ = {0, 1, 2}, w+ = {0, 3}   (fired: l1 → l2)


SLIDE 47

The Monotonicity Abstraction

Computing Abstract Error Traces

[Figure: automaton P with locations l1, l2, l3; edge l1 → l2 with guard v ≤ 0 and effect v := v + 1; edge l1 → l3 with effect w := 3]

Initial state: P = l1, v = 0, w = 0 Error formula: ϕ = (v = 2)

Computation of Abstract Error Traces

1. Simultaneous execution of all enabled transitions
2. Remove unnecessary transitions

P+ = {l1}, v+ = {0}, w+ = {0}
P+ = {l1, l2, l3}, v+ = {0, 1}, w+ = {0, 3}   (fired: l1 → l2, l1 → l3)
P+ = {l1, l2, l3}, v+ = {0, 1, 2}, w+ = {0, 3}   (fired: l1 → l2)

The transition l1 −(w := 3)→ l3 does not occur in the abstract error trace ⇒ w ∉ P

SLIDE 48

Concluding Comments

Where does it work, where not

Works well for modular systems with little interaction (many real-world applications)
Problems with systems with tight interaction ⇒ tends towards the identity abstraction

SLIDE 49

Pattern Selection for Predicate Abstractions

Abstract State Space

P = {p1, . . . , pn} is a set of predicates that “talk” about the variables of the system
Abstract states b assign each p ∈ P a truth value (can be represented as bitstrings)
An abstract state b corresponds to the set of concrete states [b] = {s | s ⊨ b}
There is an abstract transition t = b → b′ iff ∃s ∈ [b] and ∃s′ ∈ [b′] such that s → s′ is a concrete transition

SLIDE 50

Syntax-based Pattern Selection

Pattern P

Set of predicates containing:
All constraints that appear in guards or location invariants
For each location: a location predicate

Example

[Figure: the automata P (locations left, walk, right) and Q (locations red, yellow, green) from the running example]

P = {P = left, P = walk, P = right, Q = red, Q = yellow, Q = green, x > 2, y ≤ 1, y ≥ 2}

SLIDE 51

Mapping Concrete to Abstract States

Predicate Abstraction

Mapping Concrete to Abstract States

Let P = {p1, . . . , pn} be a set of predicates. For every p ∈ P, check whether s ⊨ p ⇒ abstract state sα. Looking up abstract states is straightforward.
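A sketch of this mapping, with predicates as boolean functions and the abstract state as a hashable tuple (so a hash-table lookup is direct); the dictionary state encoding is an assumption for illustration:

```python
def abstract_state(s, predicates):
    # Evaluate every predicate on s; the tuple of truth values is the
    # abstract state (a bitstring), usable directly as a dict key.
    return tuple(p(s) for p in predicates)
```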

SLIDE 52

Concluding Comments

Syntax-based Pattern Selection

Can induce large pattern databases (many predicates)
This can be overcome by “splitting” the system into several independent parts:
Construct a PDB for each of these parts and
Combine (maximize or add) the heuristic values
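Combining component heuristics is one line each; note that adding is only admissible when the parts are truly independent:

```python
def combine_max(heuristics):
    # The maximum of admissible heuristics is always admissible.
    return lambda s: max(h(s) for h in heuristics)

def combine_add(heuristics):
    # The sum is more informed, but admissible only for independent parts.
    return lambda s: sum(h(s) for h in heuristics)
```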

SLIDE 53

A Local Search Approach to Pattern Selection

Local search in the pattern space

Given: the set of variables V used to define a system and a threshold for the maximum size of the PDB

Start with pattern P = ∅
While |P| < threshold do:
    Select v ∈ V \ P such that P′ = P ∪ {v} is better than P′′ = P ∪ {w} for all w ≠ v
    P = P′

[Figure: search space of patterns, growing from ∅ via singletons {vi} to pairs {vi, vj}, . . .]

Question: how can we measure the quality of a pattern?
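The loop can be sketched with pluggable `pdb_size` and `quality` callbacks (both assumed interfaces; the next slide suggests the average heuristic value of the induced PDB as one possible quality measure):

```python
def local_search_pattern(variables, pdb_size, quality, threshold):
    # Greedy hill-climbing in pattern space: repeatedly add the single
    # variable that improves the quality measure most, as long as the
    # induced PDB stays below the size threshold.
    pattern = frozenset()
    while True:
        candidates = [pattern | {v} for v in variables if v not in pattern]
        candidates = [c for c in candidates if pdb_size(c) < threshold]
        if not candidates:
            return pattern
        best = max(candidates, key=quality)
        if quality(best) <= quality(pattern):
            return pattern  # no improving neighbour: local optimum
        pattern = best
```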

SLIDE 54

Evaluating Patterns

Estimating a PDB heuristic’s quality

A possible quality measurement

Average heuristic value h̄ of the PDB heuristic h induced by the pattern
Intuition: if h dominates h′, then h̄ ≥ h̄′

Problems

What if there are dead ends that h can detect? (how to cope with ∞?)
For the evaluation, the PDB has to be constructed (can be expensive)

SLIDE 55

Mapping Concrete to Abstract States

Variable Abstraction

Symbolic state space S

Symbolic state: s = ⟨d, Z⟩ with discrete part d and zone Z

1. Store Sα in a hash table

sα = ⟨dα, Zα⟩; bucket: abstract states with equal discrete part

[Figure: hash bucket containing ⟨dA, Z1⟩, ⟨dA, Z2⟩, ⟨dA, Z3⟩]

2. Find abstract states

s′ = ⟨dα, Z′⟩ with Z′ ∩ Zα ≠ ∅

[Figure: zone ZA overlapping zones Z1, Z2, Z3]

3. Heuristic value

h(s) = min { distα(s′) | s′ ∈ Sα, s′ ∩ sα ≠ ∅ }
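For intuition, here is the lookup with one-clock zones modelled as closed intervals, a deliberate simplification (real tools represent multi-clock zones as difference bound matrices); the flat `pdb` dictionary shape is also an assumption:

```python
def zones_overlap(z1, z2):
    # One-clock zones as closed intervals (lo, hi): nonempty intersection?
    return max(z1[0], z2[0]) <= min(z1[1], z2[1])

def zone_pdb_heuristic(pdb, s):
    # Minimum abstract distance over abstract states with a matching
    # discrete part and an overlapping zone.
    d, z = s
    dists = [dist for (da, za), dist in pdb.items()
             if da == d and zones_overlap(z, za)]
    return min(dists) if dists else float("inf")
```

Here the discrete part is compared with plain equality; in a real variable abstraction d would first be projected onto the pattern.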

SLIDE 56

The Big Question

So far, we have seen

Different abstractions Different approaches for pattern selection

SLIDE 57

The Big Question

So far, we have seen

Different abstractions Different approaches for pattern selection

But . . .

Which approach works best?

SLIDE 58

Empirical Evaluation

Experimental Setup

2.66 GHz Intel Xeon, memout at 4 GB Implemented either in Mcta or Uppaal/DMC

Single-tracked Line Segment (flawed version)

[Figure: layout of the Single-tracked Line Segment case study: PLC 1 and PLC 2 with components ES1, CS1, LS1 and ES2, LS2, CS2]

SLIDE 59

Empirical Results

A∗ Search

[Plots: explored states (10³–10⁸) and runtime in seconds (0.1–10⁴) for configurations C1–C9 under A∗ search]

Edelkamp et al.: based on plain graph distance
Kupferschmid et al.: monotonicity abstraction
Dräger et al.: iteratively “merges” two automata by abstracting their cross product
Hoffmann et al.: PDB heuristic, syntax-based pattern selection
Qian et al.: PDB heuristic, COI-based pattern selection, user interaction
Kupferschmid et al.: PDB heuristic, CE-based pattern selection

SLIDE 60

Empirical Results

Greedy Search

[Plots: explored states (10³–10⁸), runtime in seconds (0.1–10⁴), and error trace length (10⁰–10⁷) for configurations C1–C9 under greedy search]

Edelkamp et al.
Kupferschmid et al. (monotonicity abstraction)
Dräger et al.
Hoffmann et al.
Qian et al.
Kupferschmid et al. (CE-based pattern selection)

SLIDE 61

Literature

About this List

This list is meant to be focused, not comprehensive. Hence, it is a somewhat subjective mix of papers we consider important and relevant to the lecture’s topic. If a paper is not listed, there are many possible reasons:

We do not know it.
We forgot it.
We do not think it is (sufficiently) important.
It overlaps considerably with another paper listed here.
Its topic is not close enough to the focus of this lesson (e.g., papers on domain-dependent search).

SLIDE 62

References: Directed Model Checking

◮ C. Han Yang and David L. Dill.

Validation with guided search of the state space. In Proc. Conference on Design Automation, pp. 599–604, 1998. First paper about MC + heuristic search

◮ Stefan Edelkamp, Alberto Lluch-Lafuente, and Stefan Leue.

Directed explicit model checking with HSF-SPIN. In Proc. SPIN 2001, pp. 57–79, 2001. Coined the term directed model checking

◮ Judea Pearl.

Heuristics: Intelligent Search Strategies for Computer Problem Solving. Addison-Wesley, 1984. Discusses the foundations of heuristic search

SLIDE 63

References: Pattern Database Heuristics

◮ Joseph C. Culberson and Jonathan Schaeffer.

Pattern databases. Computational Intelligence, 14(3):318–334, 1998. First paper on pattern database heuristics

◮ Stefan Edelkamp.

Symbolic pattern databases in heuristic search planning. In Proc. AIPS 2002, pp. 274–283, 2002. Uses BDDs to store pattern databases more compactly.

◮ Kairong Qian and Albert Nymeyer.

Guided invariant model checking based on abstraction and symbolic pattern databases. In Proc. TACAS 2004, pp. 497–511, 2004. COI-based pattern selection

SLIDE 64

References: Pattern Database Heuristics (ctd.)

◮ Stefan Edelkamp.

Automated creation of pattern database search heuristics. In Proc. MOCHART 2006, pp. 35–50, 2007. First search-based pattern selection method.

◮ Jörg Hoffmann, Jan-Georg Smaus, Andrey Rybalchenko, Sebastian Kupferschmid, and Andreas Podelski.

Using predicate abstraction to generate heuristic functions in Uppaal. In Proc. MOCHART 2006, pp. 51–66, 2007. Uses predicate abstraction to generate PDB heuristics

◮ Sebastian Kupferschmid, Jörg Hoffmann, and Kim G. Larsen.

Fast directed model checking via Russian doll abstraction. In Proc. TACAS 2008, pp. 203–217, 2008. Introduces CE-based pattern selection

SLIDE 65

References: Other Heuristics

◮ Sebastian Kupferschmid, Jörg Hoffmann, Henning Dierks, and Gerd Behrmann.

Adapting an AI planning heuristic for directed model checking. In Proc. SPIN 2006, pp. 35–52, 2006. Introduces the monotonicity abstraction (for model checking)

◮ Klaus Dräger, Bernd Finkbeiner, and Andreas Podelski.

Directed model checking with distance-preserving abstractions. International Journal on Software Tools for Technology Transfer, 11(1):27–37, 2009. Introduces distance-preserving abstractions

SLIDE 66

References: Other Heuristics (ctd.)

◮ Martin Wehrle and Malte Helmert.

The causal graph revisited for directed model checking. In Proc. SAS 2009, pp. 86–101, 2009. Adapts the causal graph heuristic from AI planning

◮ Martin Wehrle, Sebastian Kupferschmid, and Andreas Podelski.

Transition-based directed model checking. In Proc. TACAS 2009, pp. 186–200, 2009. General framework to accelerate heuristic search

SLIDE 67

References: DMC Tools for Timed Automata

◮ Sebastian Kupferschmid, Klaus Dräger, Jörg Hoffmann, Bernd Finkbeiner, Henning Dierks, Andreas Podelski, and Gerd Behrmann.

Uppaal/DMC – abstraction-based heuristics for directed model checking. In Proc. TACAS 2007, pp. 679–682, 2007. DMC extension of Uppaal

◮ Sebastian Kupferschmid, Martin Wehrle, Bernhard Nebel, and Andreas Podelski.

Faster than Uppaal? In Proc. CAV 2008, pp. 552–555, 2008. Open source directed model checker for timed automata