Transformation of Turing Machines into Context-Dependent Fusion Grammars

Aaron Lye

University of Bremen, lye@math.uni-bremen.de

17.07.2019, 10th International Workshop on Graph Computation Models (GCM)


Motivation

Fusion grammars were introduced at ICGT 2017 as a generating device for hypergraph languages. Context-dependent fusion grammars were introduced at LATA 2019 to simulate Petri nets. How powerful are context-dependent fusion grammars?

  • They are at least as powerful as Petri nets.
  • They are powerful enough to simulate Turing machines.
  • They can generate all recursively enumerable string languages (up to representation) and are universal in this respect.

Hypergraph

We consider hypergraphs over Σ. A hyperedge has a sequence of source vertices v_1 ⋯ v_{k_1}, a sequence of target vertices w_1 ⋯ w_{k_2}, and a label A ∈ Σ. The class of all hypergraphs over Σ is denoted by H_Σ.
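This notion can be made concrete with a small sketch. The following Python encoding is my own illustration (the names `Hyperedge` and `Hypergraph` and the tuple-based attachment sequences are not from the paper); it records exactly the data of the definition:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Hyperedge:
    label: str        # A in Sigma
    sources: tuple    # source vertex sequence v_1 ... v_k1
    targets: tuple    # target vertex sequence w_1 ... w_k2

    @property
    def type(self):
        # the type of the hyperedge: (k1, k2)
        return (len(self.sources), len(self.targets))

@dataclass
class Hypergraph:
    vertices: set
    edges: list

# a hypergraph with one A-labeled hyperedge of type (2, 1)
H = Hypergraph(vertices={1, 2, 3}, edges=[Hyperedge("A", (1, 2), (3,))])
```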

Fusion rule

Let F ⊆ Σ be a fusion alphabet with type: F → N × N. Each A ∈ F has a complement Ā with type(Ā) = type(A). The fusion rule fr(A) consists of two hyperedges: one labeled A with sources v_1 ⋯ v_{k_1} and targets w_1 ⋯ w_{k_2}, and one labeled Ā with sources v′_1 ⋯ v′_{k_1} and targets w′_1 ⋯ w′_{k_2}, where type(A) = (k_1, k_2).

Rule application

  • 1. Find a matching morphism g of fr(A) in the hypergraph H.
  • 2. Remove the images of the two hyperedges of fr(A).
  • 3. Identify corresponding source and target vertices of the removed hyperedges (v_i = v′_i for i = 1, …, k_1 and w_j = w′_j for j = 1, …, k_2).

Rule application is denoted by H =⇒_{fr(A)} H′.
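The three steps can be sketched operationally. This is a minimal, illustrative Python implementation assuming hyperedges are encoded as (label, sources, targets) tuples and the matching of step 1 is already given by the indices of the two complementary hyperedges:

```python
def fuse(vertices, edges, i, j):
    """Apply fr(A) to edges[i] (labeled A) and edges[j] (labeled with the
    complement of A): remove both and identify corresponding vertices."""
    (_, a_src, a_tgt), (_, b_src, b_tgt) = edges[i], edges[j]
    # complementary hyperedges must have the same type
    assert (len(a_src), len(a_tgt)) == (len(b_src), len(b_tgt))
    # step 3: map each primed attachment vertex to its unprimed partner
    merge = dict(zip(b_src + b_tgt, a_src + a_tgt))
    rep = lambda v: merge.get(v, v)
    # step 2: drop the two matched hyperedges, then relabel attachments
    new_edges = [(lab, tuple(map(rep, s)), tuple(map(rep, t)))
                 for k, (lab, s, t) in enumerate(edges) if k not in (i, j)]
    return {rep(v) for v in vertices}, new_edges

# fusing the A/A~ pair merges vertex 3 into 1 and vertex 4 into 2
V, E = fuse({1, 2, 3, 4},
            [("A", (1,), (2,)), ("A~", (3,), (4,)), ("a", (2,), (3,))], 0, 1)
```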

Context-dependent fusion rule and its application

A context-dependent fusion rule is a triple (fr(A), PC, NC), where PC and NC are sets of morphisms with domain fr(A). It is applicable to a hypergraph H via a matching morphism g: fr(A) → H if

  • 1. for each c: fr(A) → C in PC there exists a morphism h: C → H with h ∘ c = g such that h is injective on the set of hyperedges, and
  • 2. for each c: fr(A) → C in NC no such morphism h exists.

This is an adaptation of the definition in Habel–Heckel–Taentzer 1996.

Multiplication

Fusion adds nothing to the hypergraph, and for a finite hypergraph the number of possible fusions is finite. Therefore one may multiply (copy) connected components. Let C(H) be the set of all connected components of H and let m: C(H) → N be a multiplicity. Then

H =⇒_{m} m · H = Σ_{C ∈ C(H)} m(C) · C.
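Multiplication can be sketched as: compute the connected components and replicate each one m(C) times with fresh vertices. A hedged Python illustration, in which components are indexed by position and m is a function on those indices (both choices of this sketch, not the paper's notation); hyperedges are (label, sources, targets) tuples with at least one attachment vertex:

```python
def components(vertices, edges):
    """Connected components: vertices are linked by sharing a hyperedge."""
    parent = {v: v for v in vertices}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for _, src, tgt in edges:
        att = src + tgt
        for v in att[1:]:
            parent[find(v)] = find(att[0])
    comps = {}
    for v in vertices:
        comps.setdefault(find(v), set()).add(v)
    # attach each hyperedge to the component of its first attachment vertex
    return [(vs, [e for e in edges if find((e[1] + e[2])[0]) == root])
            for root, vs in comps.items()]

def multiply(vertices, edges, m):
    """H =>_m m.H: take m(C) disjoint copies of each connected component C."""
    new_v, new_e, fresh = set(), [], 0
    for idx, (vs, es) in enumerate(components(vertices, edges)):
        for _ in range(m(idx)):
            ren = {v: (fresh, v) for v in vs}   # fresh copy of the vertices
            fresh += 1
            new_v |= set(ren.values())
            new_e += [(lab, tuple(ren[v] for v in s), tuple(ren[v] for v in t))
                      for lab, s, t in es]
    return new_v, new_e

# two components ({1,2} with an a-edge, and the isolated vertex 3), doubled
V2, E2 = multiply({1, 2, 3}, [("a", (1,), (2,))], lambda i: 2)
```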

Context-dependent fusion grammar CDFG = (Z, F, M, T, P)

Z: finite start hypergraph
F, M, T ⊆ Σ: fusion, marker, and terminal alphabets (all finite and pairwise disjoint)
P: finite set of context-dependent fusion rules

A direct derivation is either H =⇒_{cdfr} H′ for some cdfr ∈ P, or a multiplication H =⇒_{m} m · H = Σ_{C ∈ C(H)} m(C) · C for some multiplicity m: C(H) → N. Derivations are defined by the reflexive and transitive closure. The generated language is

L(CDFG) = {rem_M(Y) | Z =⇒* H, Y ∈ C(H) ∩ (H_{T∪M} − H_T)},

where rem_M(Y) removes all marker hyperedges from Y.
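The filtering in the definition of L(CDFG) can be illustrated with a small sketch, assuming the (label, sources, targets) tuple encoding of hyperedges used in the examples above: given the connected components of a derived hypergraph, keep those lying in H_{T∪M} − H_T and apply rem_M to each:

```python
def generated_members(component_list, T, M):
    """Keep components labeled over T ∪ M that use at least one marker
    (i.e. lie in H_{T∪M} − H_T), then remove all marker hyperedges (rem_M)."""
    result = []
    for comp in component_list:
        labels = {lab for lab, _, _ in comp}
        if labels <= T | M and labels & M:
            result.append([e for e in comp if e[0] not in M])
    return result

members = generated_members(
    [[("a", (1,), (2,)), ("m", (1,), ())],   # terminal + marker: contributes
     [("a", (1,), (2,))],                    # no marker: excluded
     [("A", (1,), (2,)), ("m", (1,), ())]],  # non-terminal label A: excluded
    T={"a"}, M={"m"})
```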


Transformation of Turing Machines into Context-Dependent Fusion Grammars

Turing machine TM = (Q, Ω, Γ, ∆)

Q: finite set of states with qstart, qaccept ∈ Q and qstart ≠ qaccept
Ω: input alphabet
Γ: tape alphabet with Ω ⊆ Γ and blank symbol □ ∈ Γ \ Ω
∆: transition relation ∆ ⊆ (Q \ {qaccept}) × Q × Γ × Γ × {l, n, r} (state transition, symbol replacement, head movement)

It has one two-sided infinite tape. Q × ^∞Γ* × Γ*^∞ is the set of configurations, with the usual transition step c ⊢_TM c′ based on ∆. The recognized language: w ∈ Ω* belongs to L(TM) if and only if qaccept is reachable from the start configuration (qstart, □^∞, w□^∞).
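The transition step c ⊢_TM c′ can be sketched on a simple string-based configuration encoding (state, tape left of the head, tape from the head onward); this encoding and the function name are illustrative, not from the paper:

```python
BLANK = "_"

def steps(delta, config):
    """Yield all successor configurations of config under the transition
    relation delta ⊆ (Q − {q_accept}) × Q × Γ × Γ × {l, n, r}."""
    state, left, right = config
    scanned = right[0] if right else BLANK   # two-sided tape padded by blanks
    rest = right[1:] if right else ""
    for (p, q, x, y, d) in delta:
        if p != state or x != scanned:
            continue
        if d == "n":                          # write y, keep head position
            yield (q, left, y + rest)
        elif d == "r":                        # write y, move head right
            yield (q, left + y, rest)
        else:                                 # d == "l": write y, move left
            yield (q, left[:-1], (left[-1] if left else BLANK) + y + rest)

# the example transition (qstart, qaccept, a, c, r) applied to input ab
succ = list(steps({("qstart", "qaccept", "a", "c", "r")},
                  ("qstart", "", "ab")))
```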

Main construction steps:

  • 1. Representation of the TM by a hypergraph.
  • 2. Generation of arbitrary inputs on the tape.
  • 3. Simulation of a transition step of the TM.

(Context-dependent) fusion rules can only consume two complementary labeled hyperedges per rule application, so all modifications must be expressed in this way.

Representation of the Turing machine by a hypergraph

TM is represented by its usual state graph (the figure shows states qstart, qaux, qaccept with transitions b/□/r, b/b/n, a/c/r). A tape-hyperedge enables fusion with the tape graph. An acc-loop indicates the accepting state. The head-hyperedge

  • 1. connects the Turing machine with the tape graph,
  • 2. indicates the current state (its first source),
  • 3. points to the current symbol to be read after the tape is fused.

Hypergraphical representation of the tape and its generation

An input tape containing ab and the corresponding working tape ⋯ □ a b □ ⋯ correspond to a tape graph (the figure shows a chain with hyperedges tape, cut, ⊲, µ, af, a, bf, b, ⊳). The rule fr(gen) interleaves gen-hyperedges into this chain. cut is used to disconnect the upper part (with the marker) from the rest. Extension to the left/right is done via fr(⊲) and fr(⊳). By this construction arbitrary tape graphs can be generated.

Hypergraph representation of a configuration

The tape graph and the state graph are fused via fr(tape); the resulting hypergraph (shown in the figure) represents the configuration.

Simulating the transition step for (qstart, qaccept, a, c, r)

The step is simulated by a sequence of three context-dependent fusions (the figures show the configuration hypergraph before and after each application):

  • 1. ∆(d, a/c/r) w.r.t. fr(head). Negative context: no two incoming and no two outgoing Γf-edges.
  • 2. fuse loop out(a). Negative context: ∅.
  • 3. fuse 2in(d). Negative context: cf. paper.

The positive contexts are the subhypergraphs shown in the figures. Other transition steps are simulated analogously.

Simulating a transition step

Connected components in the start hypergraph (figures omitted):

  • (a) C(u, x/y/l, i)
  • (b) C(u, x/y/n, i)
  • (c) C(u, x/y/r, i)

for u, x, y ∈ Γ and 1 ≤ i ≤ |Q|.

P∆: the context-dependent fusion rules are

  • ∆(u, x/y/dir) for each u ∈ Γ and x/y/dir ∈ Λ, where Λ = {x/y/dir | x, y ∈ Γ, dir ∈ {l, n, r}}
  • fuse 2in(x), fuse 2out(x), fuse loop in(x), fuse loop out(x) for each x ∈ Γ

Lemma: One-to-one correspondence of a transition step in TM and a particular derivation sequence in CDFG(TM)

H + C(u, x/y/l, k) =⇒_{∆(u,x/y/l)} X =⇒_{fuse loop in(u)} X′ =⇒_{fuse 2out(x)} H′
H + C(u, x/y/n, k) =⇒_{∆(u,x/y/n)} X =⇒_{fuse 2out(x)} H′
H + C(u, x/y/r, k) =⇒_{∆(u,x/y/r)} X =⇒_{fuse loop out(u)} X′ =⇒_{fuse 2in(x)} H′

H and H′ are the hypergraph representations of the respective configurations w.r.t. the same input string.

Acceptance

Two further rules finish a successful simulation (the figures show the configuration hypergraph before and after each application):

  • accept. Positive context: the acc-loop at the current state of the head-hyperedge. Negative context: ∅. Applying accept replaces head by term.
  • cut. Positive context: a pair of cut-hyperedges. Negative context: ∅. Applying cut disconnects the marked part representing the input string from the rest.
slide-53
SLIDE 53

19/21

Main result

Theorem

Let TM be a Turing machine. Let CDFG(TM) be the corresponding context-dependent fusion grammar. L(CDFG(TM)) = {sg(w) | w ∈ L(TM)} generated language recognized language sg(w) graph representation of a string w.

slide-54
SLIDE 54

19/21

Main result

Theorem

Let TM be a Turing machine. Let CDFG(TM) be the corresponding context-dependent fusion grammar. L(CDFG(TM)) = {sg(w) | w ∈ L(TM)} generated language recognized language sg(w) graph representation of a string w.

  • 1. Lemma: One-to-one correspondence of a computation step in

TM and a derivation sequence in CDFG(TM)

  • 2. Lemma: accept and cut only applicable to hypergraph

representation of configurations wrt input w if and only if w ∈ L(TM).
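For intuition, sg(w) can be sketched as the usual string graph: a path whose i-th edge carries the i-th symbol of w. This is my sketch of the standard construction (using the tuple encoding of the earlier examples); the paper's exact definition may differ in details:

```python
def sg(w):
    """String graph of w: vertices 0..len(w); for each position i an
    edge labeled w[i] from vertex i to vertex i+1."""
    vertices = set(range(len(w) + 1))
    edges = [(w[i], (i,), (i + 1,)) for i in range(len(w))]
    return vertices, edges

V, E = sg("ab")
```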


Conclusion

Context-dependent fusion grammars are powerful enough to simulate Turing machines. They can generate all recursively enumerable string languages (up to representation) and are universal in this respect. In the literature, one encounters model transformations from several modeling approaches into Turing machines. Now they can be extended to context-dependent fusion grammars. Does this provide interesting insights?

Future work

Are positive-only or negative-only context conditions powerful enough to cover Turing machines? At ICGT 2018 we introduced splicing/fusion grammars, which enhance fusion grammars by the inversion of fusions. What does a natural transformation of context-dependent fusion grammars into splicing/fusion grammars, or the other way round, look like?

Thank you! Questions?