Ackermann-Hardness for Lossy Counter Machines (and Reset Petri Nets)


SLIDE 1

Ackermann-Hardness for Lossy Counter Machines (and Reset Petri Nets)

Philippe Schnoebelen LSV, CNRS & ENS Cachan + Oxford 1-year visitor QM EECS–TCS Seminar, London, June 20th 2012

SLIDE 2

Part I: Lossy counter machines

2/20

SLIDE 3

COUNTER MACHINES

Finite state control + finite number of “counters” (say m) + simple instructions and tests

[Diagram: locations ℓ0 → ℓ1 → ℓ2 → ℓ3 with instructions c1++, c2>0? c2--, and c3=0?; counter values c1 = 1, c2 = 4, c3 = 0]

Operational semantics:
– Configurations: Conf ≝ Loc × N^C = {s,t,...}, e.g., s0 = (ℓ0,1,4,0)
– Steps: (ℓ0,1,4,0) → (ℓ1,2,4,0) → (ℓ2,2,3,0) → (ℓ3,2,3,0) → ···

A well-known model, Turing-powerful as soon as there are 2 counters
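These semantics are easy to prototype. Below is a minimal Python sketch that replays the example run; the dictionary encoding of the machine and the instruction names (`inc`, `dec`, `zero`) are my own, not from the talk.

```python
# Minimal counter-machine semantics (hypothetical encoding, for illustration).
# Each location maps to one (instruction, counter index, target location).
M = {
    "l0": ("inc", 0, "l1"),    # c1++
    "l1": ("dec", 1, "l2"),    # c2>0? c2--
    "l2": ("zero", 2, "l3"),   # c3=0?
}

def step(conf):
    """One reliable step from configuration (loc, counters); None if blocked."""
    loc, cs = conf
    if loc not in M:
        return None
    op, i, target = M[loc]
    cs = list(cs)
    if op == "inc":
        cs[i] += 1
    elif op == "dec":
        if cs[i] == 0:
            return None          # guard c_i > 0 fails
        cs[i] -= 1
    elif op == "zero":
        if cs[i] != 0:
            return None          # test c_i = 0 fails
    return (target, tuple(cs))

# Replaying the run from the slide:
s0 = ("l0", (1, 4, 0))
s1 = step(s0)   # ("l1", (2, 4, 0))
s2 = step(s1)   # ("l2", (2, 3, 0))
s3 = step(s2)   # ("l3", (2, 3, 0))
```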

3/20

SLIDE 4

LCM = LOSSY COUNTER MACHINES

LCM = Counter machines with unreliability: “counters decrease nondeterministically” [R. Mayr, TCS 2003] (Weaker) computational model useful, e.g., for logics like XPath or LTL+data. See decidability survey in [S., RP 2010].

  • Semantics. Reliable steps: s →rel t as above. Lossy steps: s → t ⇔def s ⩾ s′ →rel t′ ⩾ t for some s′ and t′, where (ℓ,a1,...,am) ⩽ (ℓ′,b1,...,bm) ⇔def ℓ = ℓ′ ∧ a1 ⩽ b1 ∧ ... ∧ am ⩽ bm

  • Prop. [Monotony] s →+ t implies s′ →+ t′ for all s′ ⩾ s and t′ ⩽ t

  • NB. (Conf,⩽) is a Well-Quasi-Ordering (a WQO), hence LCM’s are well-structured
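A small Python sketch of the lossy semantics, assuming the same (loc, counters) configurations as above; the function names and the single-instruction example machine are mine, for illustration only.

```python
from itertools import product

def leq(s, t):
    """Configuration ordering: same location, componentwise <= on counters."""
    (l1, a), (l2, b) = s, t
    return l1 == l2 and all(x <= y for x, y in zip(a, b))

def downward(conf):
    """All configurations <= conf (lose any number of tokens)."""
    loc, cs = conf
    return {(loc, v) for v in product(*(range(c + 1) for c in cs))}

def lossy_succs(conf, rel_step):
    """Lossy successors of conf: s -> t iff s >= s' ->rel t' >= t."""
    out = set()
    for s2 in downward(conf):            # choose some s' <= s
        t2 = rel_step(s2)                # one reliable step s' ->rel t'
        if t2 is not None:
            out |= downward(t2)          # the target may be any t <= t'
    return out

# A one-instruction example: location l0 does c1++ and moves to l1.
def rel_step(conf):
    loc, (c1, c2) = conf
    return ("l1", (c1 + 1, c2)) if loc == "l0" else None
```

Note how ("l0",(1,1)) can lossy-step all the way down to ("l1",(0,0)): losses are allowed both before and after the reliable step.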

4/20

SLIDE 5

Part II: Well-quasi-orderings and the length of bad sequences

5/20

SLIDE 6

WQO: WELL-QUASI-ORDERINGS

(X,⩽) is a well-quasi-ordering (a WQO) if any infinite sequence x0,x1,x2,... over X contains an increasing pair xi ⩽ xj (for some i < j). Examples.

  • 1. (N^k,⩽prod) is a WQO (Dickson’s Lemma)

where, e.g., ⟨3,2,1⟩ ⩽ ⟨5,2,2⟩ but ⟨1,2,3⟩ ≰ ⟨5,2,2⟩

  • 2. (Σ∗,⊑) is a WQO (Higman’s Lemma)

where, e.g., abc ⊑ bacbc but cba ⋢ bacbc

Many other examples: (Conf,⩽) for LCM’s, finite trees with tree embedding (Kruskal’s Theorem), graphs ordered as minors (Robertson-Seymour Theorem), ... Systems whose steps are monotonic wrt a WQO on configurations, called “well-structured systems”, enjoy generic decidability results [Finkel & S., TCS 2001]. My current research program: algorithmic aspects of WQO-theory & complexity of WQO-based algorithms
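Both example orderings are a few lines of Python; the helper names below are mine. `increasing_pair` finds the pair that a WQO guarantees in any sufficiently long sequence.

```python
from itertools import combinations

def leq_prod(u, v):
    """Product (Dickson) ordering on N^k: componentwise <=."""
    return all(a <= b for a, b in zip(u, v))

def subword(u, v):
    """Higman's scattered-subword embedding: u embeds in v."""
    it = iter(v)                 # membership tests consume the iterator,
    return all(ch in it for ch in u)  # so letters must appear in order

def increasing_pair(seq, leq):
    """First pair i < j with seq[i] <= seq[j], or None."""
    for i, j in combinations(range(len(seq)), 2):
        if leq(seq[i], seq[j]):
            return i, j
    return None
```

The slide's examples: `leq_prod((3,2,1),(5,2,2))` holds while `leq_prod((1,2,3),(5,2,2))` fails; `subword("abc","bacbc")` holds while `subword("cba","bacbc")` fails.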

6/20

SLIDE 7

LENGTH OF BAD SEQUENCES

  • Def. A sequence x0,x1,... over X is bad ⇔def there is no increasing pair xi ⩽ xj with i < j

Now: over a WQO, a bad sequence is necessarily finite. Complexity upper bounds ≃ “how long can a bad sequence be?” In general, bad sequences over a given WQO can be arbitrarily long. However, controlled bad sequences cannot:

  • Def. x0,x1,... is (g,n)-controlled ⇔def |xi| ⩽ g^i(n)

Length Function Theorems are results of the form “any (g,n)-controlled bad sequence x0,x1,...,xl over X has length l ⩽ LX,g(n)” for some bounding functions LX,g.
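Both definitions are directly checkable. The sketch below (my own helper names) verifies a concrete bad sequence over (N^2,⩽prod) that is (Succ,1)-controlled, using the infinity norm as the size function.

```python
def leq_prod(u, v):
    return all(a <= b for a, b in zip(u, v))

def is_bad(seq, leq):
    """Bad: no increasing pair x_i <= x_j with i < j."""
    return not any(leq(seq[i], seq[j])
                   for i in range(len(seq)) for j in range(i + 1, len(seq)))

def is_controlled(seq, g, n, size):
    """(g,n)-controlled: |x_i| <= g^i(n), i.e. g iterated i times on n."""
    bound = n
    for x in seq:
        if size(x) > bound:
            return False
        bound = g(bound)
    return True

# A (Succ,1)-controlled bad sequence over (N^2, <=prod), length 6:
seq = [(1, 1), (0, 2), (3, 0), (2, 0), (1, 0), (0, 0)]
norm = lambda v: max(v)      # infinity norm
succ = lambda x: x + 1
```

Appending any further vector to `seq` would dominate one of its elements, illustrating why controlled bad sequences must stop.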

7/20

SLIDE 8

THE FAST-GROWING HIERARCHY

A.k.a. the (Extended) Grzegorczyk Hierarchy. For α = 0,1,2,... define Fα : N → N with:

F0(n) ≝ n + 1 (D1)
Fα+1(n) ≝ Fα^{n+1}(n) = Fα(Fα(...Fα(n)...)) [n+1 times] (D2)
Fω(n) ≝ Fn(n) ≃ Ackermann(n) (D3)

This yields: F1(n) = 2n + 1, F2(n) = (n + 1)2^{n+1} − 1, and F3(n) > 2^2^···^2 (a tower of n 2’s).

F4 is ... impossible to grasp intuitively (at least for me)

Length Function Theorem for N^k. [LICS 2011, ICALP 2011] For primitive-recursive g, the length of (g,n)-controlled bad sequences

  • over (N^k,⩽) is in Fk+O(1) (and in Fk for small g).
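Definitions (D1)-(D3) translate directly into code; only tiny arguments are feasible, since already F3 produces towers of exponentials. A sketch:

```python
def F(alpha, n):
    """F_alpha for finite alpha: F_0(n) = n+1,
    F_{alpha+1}(n) = F_alpha iterated n+1 times on n."""
    if alpha == 0:
        return n + 1
    for _ in range(n + 1):       # range fixed before n is mutated
        n = F(alpha - 1, n)
    return n

def F_omega(n):
    """Diagonalisation (D3): F_omega(n) = F_n(n), Ackermann-like growth."""
    return F(n, n)
```

Sanity checks against the closed forms on the slide: F(1, 3) = 2·3+1 = 7 and F(2, 2) = 3·2^3 − 1 = 23.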

8/20

SLIDE 9

Part III: Upper bounds for LCM’s

9/20

SLIDE 10

DECIDING TERMINATION FOR LCM’S

(Non-)Termination. There is an infinite run sinit = s0 → s1 → s2 → ··· iff there is a loop sinit = s0 → ··· → sk → ··· → sn = sk. Hence termination is co-r.e. for LCM’s.

  • Furthermore. There is a loop from sinit iff there is a loop that is a bad sequence (until sn−1)

  • Proof. Assume a length-n loop has an increasing pair si ⩽ sj for i < j < n. Then we obtain a shorter loop by replacing sj−1 → sj with sj−1 → s′j = si, a legal lossy step since si ⩽ sj. Thus the shortest loop has no increasing pair.

  • Furthermore. Since necessarily s → t implies |t| ⩽ |s| + 1, any run is Succ-controlled. Hence n ⩽ LA,Succ(|sinit|) for A ≝ Loc × N^|C| (|Loc| copies of N^m).

  • Cor. Termination of LCM’s can be decided with complexity in Fω, and in Fm when we fix |C| = m
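The loop-based argument can be illustrated by a naive bounded-depth search (my own encoding, not the talk's algorithm): explore lossy runs, here modeled as reliable steps plus spontaneous single decrements, and report nontermination as soon as a configuration covers an earlier one on the same path at the same location; by Monotony the run can then repeat forever.

```python
def lossy_nonterminating(M, s0, depth):
    """Search (up to `depth`) for a run with a covering pair s_k <= s_n at the
    same location; such a self-covering run certifies lossy nontermination.
    False only means none was found within the bound."""
    def succs(conf):
        loc, cs = conf
        out = []
        if loc in M:                     # one reliable step
            op, i, tgt = M[loc]
            c2 = list(cs)
            if op == "inc":
                c2[i] += 1; out.append((tgt, tuple(c2)))
            elif op == "dec" and c2[i] > 0:
                c2[i] -= 1; out.append((tgt, tuple(c2)))
            elif op == "zero" and c2[i] == 0:
                out.append((tgt, tuple(c2)))
        for i, c in enumerate(cs):       # lossy: a counter may silently drop
            if c > 0:
                c3 = list(cs); c3[i] -= 1
                out.append((loc, tuple(c3)))
        return out

    def dfs(path):
        if len(path) > depth:
            return False
        for nxt in succs(path[-1]):
            if any(p[0] == nxt[0] and all(a <= b for a, b in zip(p[1], nxt[1]))
                   for p in path):
                return True              # covering pair found: can loop forever
            if dfs(path + [nxt]):
                return True
        return False

    return dfs([s0])
```

A self-incrementing loop is immediately caught, while a machine that only decrements exhausts its counter and terminates.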

10/20

SLIDE 11

DECIDING REACHABILITY FOR LCM’S

Same ideas work for reachability: “is there a run from sinit to sgoal?”

  • Proof. If a run sinit = s0 → s1 → ··· → sn = sgoal has a decreasing pair si ⩾ sj for 0 < i < j, it can be shortened as s0 → ··· → si−1 → sj → ··· → sn (the lossy step si−1 → sj is legal since sj ⩽ si)

  • Cor. If sgoal can be reached from sinit, this can be achieved via a run that is a (reversed) bad sequence

  • But. How is the reversed run g-controlled for some g?
  • Prop. In the shortest run, |si| ⩽ |si+1| + 1 for all 0 < i < n
  • Cor. Reachability in LCM’s can be decided with complexity in Fω, or Fm (same as Termination)

  • NB. This generic technique extends to other problems/models
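For intuition only, lossy reachability can be explored by a bounded breadth-first search (my own sketch; the actual decision procedure relies on the bad-sequence length bound, not a cap): lossy steps are again modeled as reliable steps plus spontaneous decrements, and counters are capped so the state space is finite.

```python
from collections import deque

def lossy_reachable(M, s0, goal, cap):
    """Bounded BFS over lossy steps; counters never exceed `cap`.
    True means goal is lossy-reachable; False is only valid within the cap."""
    seen, todo = {s0}, deque([s0])
    while todo:
        loc, cs = todo.popleft()
        if (loc, cs) == goal:
            return True
        nxt = []
        if loc in M:                     # one reliable step
            op, i, tgt = M[loc]
            c2 = list(cs)
            if op == "inc" and c2[i] < cap:
                c2[i] += 1; nxt.append((tgt, tuple(c2)))
            elif op == "dec" and c2[i] > 0:
                c2[i] -= 1; nxt.append((tgt, tuple(c2)))
            elif op == "zero" and c2[i] == 0:
                nxt.append((tgt, tuple(c2)))
        for i, c in enumerate(cs):       # lossy: a counter may silently drop
            if c > 0:
                c2 = list(cs); c2[i] -= 1
                nxt.append((loc, tuple(c2)))
        for t in nxt:
            if t not in seen:
                seen.add(t); todo.append(t)
    return False

# A zero-test that fails reliably can succeed lossily: c2 may be "lost".
M = {"l0": ("inc", 0, "l1"), "l1": ("zero", 1, "l2")}
```

From ("l0",(0,1)) the configuration ("l2",(1,0)) is lossy-reachable (drop c2, then pass the test), whereas ("l2",(1,1)) is not, since the zero-test requires c2 = 0 and c2 can never grow back.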

11/20

SLIDE 12

Part IV: Lower Bounds via Simulation of Fast-Growing Functions

12/20

SLIDE 13

PROBLEM STATEMENT

We have (rather disgusting) upper bounds on the complexity of verification for lossy counter machines. Do we have matching lower bounds?

  • Answer. Unfortunately yes (see rest of this talk)
  • NB. We mean lower bounds on the decision problems, not just on the simple algorithms we just saw

Reduction strategy for proving lower bounds in lossy systems:

  • 1. Compute fast-growing functions unreliably: Hardy hierarchy
  • 2. Use this as an unreliable computational resource
  • 3. “Check” in the end that nothing was lost
  • 4. This requires computing, unreliably, the inverses of fast-growing functions

13/20

SLIDE 14

FAST-GROWING VS. HARDY HIERARCHY

F0(n) ≝ n + 1                                                H0(n) ≝ n
Fα+1(n) ≝ Fα^{n+1}(n) = Fα(Fα(...Fα(n)...)) [n+1 times]      Hα+1(n) ≝ Hα(n + 1)
Fλ(n) ≝ Fλn(n)                                               Hλ(n) ≝ Hλn(n)

(For a limit ordinal λ, λn denotes the n-th element of its fundamental sequence.)

  • Prop. Hω^α(n) = Fα(n) for all α and n
  • NB. Hα(n) can be evaluated by transforming a pair ⟨α,n⟩ = ⟨α0,n0⟩ →H ⟨α1,n1⟩ →H ⟨α2,n2⟩ →H ··· →H ⟨αk,nk⟩ with α0 > α1 > α2 > ··· until eventually αk = 0 and nk = Hα(n) % tail-recursion!!

Below we compute fast-growing functions and their inverses by encoding ⟨α,n⟩ →H ⟨α′,n′⟩ and ⟨α′,n′⟩ →H⁻¹ ⟨α,n⟩
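The tail-recursive evaluation can be sketched in ordinary Python for α < ω^ω, representing α by its CNF coefficient vector [am,...,a0]; each loop iteration applies one ⟨α,n⟩ →H ⟨α′,n′⟩ rewrite (the LCM does the same with counters; this code is only an illustration of the rewriting).

```python
def hardy(alpha, n):
    """H_alpha(n) for alpha < omega^omega, with alpha given by its CNF
    coefficients [a_m, ..., a_0].  Repeatedly rewrite the pair <alpha, n>:
      successor: H_{alpha+1}(n) = H_alpha(n+1)
      limit:     H_lambda(n) = H_{lambda_n}(n), using
                 (gamma + omega^{k+1})_n = gamma + omega^k * (n+1)
    until alpha = 0, where H_0(n) = n."""
    a = list(alpha)
    while any(a):
        if a[-1] > 0:            # successor ordinal: drop one, bump n
            a[-1] -= 1
            n += 1
        else:                    # limit ordinal: rewrite the last nonzero term
            p = max(i for i, c in enumerate(a) if c > 0)
            a[p] -= 1
            a[p + 1] = n + 1
    return n
```

Consistent with Hω^α(n) = Fα(n): hardy([1,0], n) = 2n+1 = F1(n), and hardy([1,0,0], 2) = 23 = F2(2).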

14/20

SLIDE 15

MH: A LCM WEAKLY COMPUTING →H FOR α < ω^ω

Write α in CNF with coefficients: α = ω^m·am + ω^{m−1}·am−1 + ··· + ω^0·a0. The encoding of α is [am,...,a0] ∈ N^{m+1}.

[am,...,a0 + 1],n →H [am,...,a0],n + 1                       % Hα+1(n) = Hα(n + 1)
[am,...,ak + 1,0,0,...,0],n →H [am,...,ak,n + 1,0,...,0],n   % Hλ(n) = Hλn(n)

Recall (γ + ω^{k+1})n = γ + ω^k·(n + 1)

15/20

SLIDE 16

MH⁻¹: A LCM WEAKLY COMPUTING →H⁻¹ FOR α < ω^ω

[am,...,a0],n + 1 →H⁻¹ [am,...,a0 + 1],n                      % Hα+1(n) = Hα(n + 1)
[am,...,ak,n + 1,0,...,0],n →H⁻¹ [am,...,ak + 1,0,...,0],n    % Hλ(n) = Hλn(n)

  • Prop. [Robustness] a ⩽ a′ and n ⩽ n′ imply H[a](n) ⩽ H[a′](n′)

16/20

SLIDE 17

COUNTER MACHINES ON A BUDGET

Mb extends M with an extra “budget” counter B: increments are paid from B and decrements are refunded to it. This ensures:

  • 1. Mb ⊢ (ℓ,B,a) →*rel (ℓ′,B′,a′) implies B + |a| = B′ + |a′|

  • 2. Mb ⊢ (ℓ,B,a) →*rel (ℓ′,B′,a′) implies M ⊢ (ℓ,a) →*rel (ℓ′,a′)

  • 3. If M ⊢ (ℓ,a) →*rel (ℓ′,a′) then ∃B,B′: Mb ⊢ (ℓ,B,a) →*rel (ℓ′,B′,a′)

  • 4. If Mb ⊢ (ℓ,B,a) →* (ℓ′,B′,a′) then Mb ⊢ (ℓ,B,a) →*rel (ℓ′,B′,a′) iff B + |a| = B′ + |a′|
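A Python sketch of one budgeted step, assuming this standard reading (pay increments from B, refund decrements); the configuration and instruction encodings are mine. The usage snippet checks property 1: B + |a| is invariant along reliable runs.

```python
def budgeted_step(conf, instr):
    """One reliable step of M_b on conf = (loc, B, counters) for
    instr = (op, counter index, target).  inc pays one unit from the
    budget B, dec refunds one unit, zero-tests leave B unchanged.
    Returns the next configuration, or None if blocked."""
    loc, B, cs = conf
    op, i, tgt = instr
    cs = list(cs)
    if op == "inc":
        if B == 0:
            return None          # out of budget: the increment blocks
        B -= 1; cs[i] += 1
    elif op == "dec":
        if cs[i] == 0:
            return None
        cs[i] -= 1; B += 1
    elif op == "zero":
        if cs[i] != 0:
            return None
    return (tgt, B, tuple(cs))

# Property 1: B + |a| is preserved by every reliable step.
c = ("l0", 3, (1, 0))
total = c[1] + sum(c[2])
for instr in [("inc", 0, "l1"), ("inc", 1, "l2"), ("dec", 0, "l3")]:
    c = budgeted_step(c, instr)
    assert c[1] + sum(c[2]) == total
```

Property 4 then follows the same intuition: in a lossy run, any loss strictly decreases B + |a|, so equality of the totals forces the run to have been reliable.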

17/20

SLIDE 18

M(m): WRAPPING IT UP

  • Prop. M(m) has a lossy run (ℓH, am:1, 0,...,0, n:m, 0,...,0) →* (ℓH−1, am:1, 0,...,0, n:m, 0,...,0)

iff M(m) has a reliable run (ℓH, am:1, 0,...,0, n:m, 0,...,0) →*rel (ℓH−1, am:1, 0,...,0, n:m, 0,...,0)

iff M has a reliable run from ℓini to ℓfin that is bounded by Hω^m(m), i.e., by Ackermann(m)

  • Cor. LCM verification is Ackermann-complete

18/20

SLIDE 19

CONCLUSION

Length of bad sequences is key to bounding the complexity of WQO-based algorithms. Here verification people have a lot to learn from proof-theory and combinatorics.

Proving matching lower bounds is not necessarily tricky (and is easy for LCM’s or Reset Petri nets) but we still lack:
— a collection of hard problems: Post Embedding Problem, ...
— a tutorial/textbook on subrecursive hierarchies (like fast-growing and Hardy hierarchies)
— a toolkit of coding tricks and lemmas for ordinals

The approach seems workable: recently we could characterize the complexity of Timed-Arc Petri nets and Data Petri Nets at Fω^ω^ω.

19/20

SLIDE 20

BIBLIOGRAPHICAL POINTERS

Finkel & S., Theor. Comp. Sci. 2001: well-structured transition systems
Baier, Bertrand & S., LPAR 2006: more on well-structured transition systems (games, probabilities, ...)
Figueira, Figueira, Schmitz & S., LICS 2011: length of bad sequences over N^k
Schmitz & S., ICALP 2011: compositional length of bad sequences
S., MFCS 2010: hardness for LCM’s and related models
S., RP 2010: decidability for LCM’s
Chambart & S., LICS 2008: hardness for LCS’s (lossy fifo channels)
Haddad, Schmitz & S., LICS 2012: hardness for Data nets and Timed-arc Petri nets
Chambart & S., ICALP 2010: Post Embedding Problem

20/20