

SLIDE 1

Recognizable Series on Hypergraphs

Raphaël Bailly (1), François Denis (2), Guillaume Rabusseau (2)

(1) Université de Technologie de Compiègne
(2) LIF, CNRS, Aix-Marseille Université

LATA 2015, March 5, 2015

Bailly, Denis, Rabusseau (UTC - LIF) Recognizable Series on Hypergraphs March 5, 2015 1 / 18

SLIDE 2

Outline

1. Objective and Method
2. Graph Weighted Model
3. Main Results
4. Towards Learning GWMs
5. Conclusion

SLIDE 3

Objective and Method

Grammatical inference: estimate probability distributions on strings/trees from samples.
↪ Many works rely on the notion of recognizable/rational series:

SLIDE 4

Objective and Method

Grammatical inference: estimate probability distributions on strings/trees from samples.
↪ Many works rely on the notion of recognizable/rational series:

A string series r : Σ∗ → R is recognizable
⇔ there exists a finite weighted automaton computing r

SLIDE 5

Objective and Method

Grammatical inference: estimate probability distributions on strings/trees from samples.
↪ Many works rely on the notion of recognizable/rational series:

A string series r : Σ∗ → R is recognizable
⇔ there exists a finite weighted automaton computing r
⇔ r has a linear representation (ι ∈ R^d, τ ∈ R^d, {M_σ ∈ R^{d×d}}_{σ∈Σ}):

    r(w) = ι⊤ M_{w1} M_{w2} ··· M_{wn} τ for all w = w1 ··· wn ∈ Σ∗
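The linear representation above translates directly into code. A minimal sketch with a hypothetical 2-dimensional weighted automaton over Σ = {a, b} (the parameter values are illustrative, not from the talk):

```python
import numpy as np

# Linear representation of a string series: r(w) = ι⊤ M_{w1} ··· M_{wn} τ.
# The automaton below is a made-up 2-dimensional example.
iota = np.array([1.0, 0.0])
tau = np.array([1.0, 1.0])
M = {"a": np.array([[0.5, 0.5], [0.0, 0.2]]),
     "b": np.array([[0.0, 0.3], [0.4, 0.1]])}

def r(w):
    v = iota
    for sigma in w:
        v = v @ M[sigma]   # multiply the matrices along the word
    return v @ tau

print(r(""))    # ι⊤τ = 1.0
print(r("ab"))  # 0.4
```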

SLIDE 6

Objective and Method

Grammatical inference: estimate probability distributions on strings/trees from samples.
↪ Many works rely on the notion of recognizable/rational series:

A string series r : Σ∗ → R is recognizable
⇔ there exists a finite weighted automaton computing r
⇔ r has a linear representation (ι ∈ R^d, τ ∈ R^d, {M_σ ∈ R^{d×d}}_{σ∈Σ}):

    r(w) = ι⊤ M_{w1} M_{w2} ··· M_{wn} τ for all w = w1 ··· wn ∈ Σ∗

Objective

Extend the notion of recognizable series to graphs and hypergraphs,
↪ by directly aiming for an algebraic characterization similar to linear representations of string/tree series.

SLIDE 7

Outline

1. Objective and Method
2. Graph Weighted Model
3. Main Results
4. Towards Learning GWMs
5. Conclusion

SLIDE 8

Graphs

A graph G = (V, E, ℓ) on a ranked alphabet F = (Σ, ♯):
- V: set of vertices,
- ℓ : V → Σ: labeling function,
- P = {(v, j) : v ∈ V, 1 ≤ j ≤ ♯ℓ(v)}: set of ports,
- E ⊂ P × P: set of edges (a partition of P into pairs).

[Figure: a graph with four vertices labeled f, f, g, a; the ports of each vertex are numbered 1..♯ℓ(v).]

Figure: A graph on the ranked alphabet F = {a(·), f(·,·), g(·,·,·)}.
V = {1, 2, 3, 4}, ℓ(1) = ℓ(2) = f, ℓ(3) = g, ℓ(4) = a,
E = {{(1,1),(3,2)}, {(1,2),(2,1)}, {(2,2),(3,1)}, {(3,3),(4,1)}}.
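The definition above is easy to encode directly. A sketch of the figure's graph as plain Python data (names are illustrative), with a sanity check that the edges partition the set of ports:

```python
# Ranked alphabet F = (Σ, ♯) and labeling ℓ : V → Σ for the example graph.
rank = {"a": 1, "f": 2, "g": 3}
labels = {1: "f", 2: "f", 3: "g", 4: "a"}

# Ports P = {(v, j) : v ∈ V, 1 ≤ j ≤ ♯ℓ(v)}
ports = {(v, j) for v, s in labels.items() for j in range((1), rank[s] + 1)}

# Edges: a partition of P into pairs
edges = [{(1, 1), (3, 2)}, {(1, 2), (2, 1)}, {(2, 2), (3, 1)}, {(3, 3), (4, 1)}]

# Sanity check: every port appears in exactly one edge.
assert set().union(*edges) == ports
assert sum(len(e) for e in edges) == len(ports)
print(len(ports))  # 8
```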

SLIDE 9

Tensors

A tensor T ∈ (R^d)^{⊗k} = R^d ⊗ ··· ⊗ R^d ≃ multi-array (T_{i1...ik}) ∈ R^{d×···×d}.

Let e1, ..., ed be the canonical basis of V = R^d. T can be expressed as

    T = Σ_{i1,...,ik ∈ [d]} T_{i1...ik} e_{i1} ⊗ ··· ⊗ e_{ik}

- k = 1: vector v_i (1 ≤ i ≤ d)
- k = 2: matrix M_{i1 i2} (1 ≤ i1, i2 ≤ d)
- k = 3: higher-order tensor T_{i1 i2 i3} (1 ≤ i1, i2, i3 ≤ d)
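The basis expansion above can be checked numerically in a few lines (a small sketch; d = 2 and k = 3 are arbitrary choices):

```python
import numpy as np

# Any order-k tensor equals the sum of its coordinates times the
# corresponding basis tensors e_{i1} ⊗ e_{i2} ⊗ e_{i3}.
d, k = 2, 3
rng = np.random.default_rng(0)
T = rng.normal(size=(d,) * k)

e = np.eye(d)
# Σ_{i1 i2 i3} T[i1,i2,i3] e_{i1} ⊗ e_{i2} ⊗ e_{i3}
rebuilt = sum(T[i, j, l] * np.einsum('a,b,c->abc', e[i], e[j], e[l])
              for i in range(d) for j in range(d) for l in range(d))
print(np.allclose(T, rebuilt))  # True
```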

SLIDE 10

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

SLIDE 11

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

Graph Weighted Model: d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}.

SLIDE 12

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

Graph Weighted Model: d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Computation of a GWM:

1. Tensor product of all tensors associated to vertices in G:

    T^f_{i1 i2} T^f_{i3 i4} T^g_{i5 i6 i7} T^a_{i8}

SLIDE 13

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

Graph Weighted Model: d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Computation of a GWM:

1. Tensor product of all tensors associated to vertices in G:

    T^f_{i1 i2} T^f_{i3 i4} T^g_{i5 i6 i7} T^a_{i8}

2. Contractions directed by the edges of G:

    Σ_{i1} T^f_{i1 i2} T^f_{i3 i4} T^g_{i5 i1 i7} T^a_{i8}

SLIDE 14

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

Graph Weighted Model: d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Computation of a GWM:

1. Tensor product of all tensors associated to vertices in G:

    T^f_{i1 i2} T^f_{i3 i4} T^g_{i5 i6 i7} T^a_{i8}

2. Contractions directed by the edges of G:

    Σ_{i1 i2} T^f_{i1 i2} T^f_{i2 i4} T^g_{i5 i1 i7} T^a_{i8}

SLIDE 15

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

Graph Weighted Model: d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Computation of a GWM:

1. Tensor product of all tensors associated to vertices in G:

    T^f_{i1 i2} T^f_{i3 i4} T^g_{i5 i6 i7} T^a_{i8}

2. Contractions directed by the edges of G:

    Σ_{i1 i2 i4} T^f_{i1 i2} T^f_{i2 i4} T^g_{i4 i1 i7} T^a_{i8}

SLIDE 16

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

Graph Weighted Model: d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Computation of a GWM:

1. Tensor product of all tensors associated to vertices in G:

    T^f_{i1 i2} T^f_{i3 i4} T^g_{i5 i6 i7} T^a_{i8}

2. Contractions directed by the edges of G:

    Σ_{i1 i2 i4 i7} T^f_{i1 i2} T^f_{i2 i4} T^g_{i4 i1 i7} T^a_{i7}

SLIDE 17

Graph Weighted Models (GWM)

A graph G on the ranked alphabet F = {g(·,·,·), f(·,·), a(·)}:

[Figure: the graph G with vertices labeled f, f, g, a and numbered ports.]

Graph Weighted Model: d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Computation of a GWM:

1. Tensor product of all tensors associated to vertices in G:

    T^f_{i1 i2} T^f_{i3 i4} T^g_{i5 i6 i7} T^a_{i8}

2. Contractions directed by the edges of G:

    r(G) = Σ_{i1 i2 i4 i7} T^f_{i1 i2} T^f_{i2 i4} T^g_{i4 i1 i7} T^a_{i7}
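The two-step computation above (tensor product, then contractions along the edges) collapses into a single einsum. A sketch with random placeholder tensors (d = 3 is an arbitrary choice; the talk fixes no numeric values):

```python
import numpy as np

# GWM on the example graph: f has 2 ports, g has 3, a has 1.
rng = np.random.default_rng(0)
d = 3
Tf = rng.normal(size=(d, d))
Tg = rng.normal(size=(d, d, d))
Ta = rng.normal(size=(d,))

# r(G) = Σ_{i1 i2 i4 i7} Tf[i1,i2] Tf[i2,i4] Tg[i4,i1,i7] Ta[i7],
# with index identifications coming from the edges
# (1,1)-(3,2), (1,2)-(2,1), (2,2)-(3,1), (3,3)-(4,1).
r = np.einsum('ij,jk,kil,l->', Tf, Tf, Tg, Ta)
print(r)
```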

SLIDE 18

GWM: Examples

F = {ι(·), τ(·), a(·,·), b(·,·), c(·,·)}, GWM {ι, M^a, M^b, M^c, τ}

G = [Figure: the string graph ι - a - b - c - τ, consecutive ports connected.]

SLIDE 19

GWM: Examples

F = {ι(·), τ(·), a(·,·), b(·,·), c(·,·)}, GWM {ι, M^a, M^b, M^c, τ}

G = [Figure: the string graph ι - a - b - c - τ, consecutive ports connected.]

1. ι_{i1} M^a_{i2 i3} M^b_{i4 i5} M^c_{i6 i7} τ_{i8}

2. r(G) = Σ_{i1 i3 i5 i7} ι_{i1} M^a_{i1 i3} M^b_{i3 i5} M^c_{i5 i7} τ_{i7} = ι⊤ M^a M^b M^c τ

SLIDE 20

GWM: Examples

F = {ι(·), τ(·), a(·,·), b(·,·), c(·,·)}, GWM {ι, M^a, M^b, M^c, τ}

G = [Figure: the string graph ι - a - b - c - τ, consecutive ports connected.]

1. ι_{i1} M^a_{i2 i3} M^b_{i4 i5} M^c_{i6 i7} τ_{i8}

2. r(G) = Σ_{i1 i3 i5 i7} ι_{i1} M^a_{i1 i3} M^b_{i3 i5} M^c_{i5 i7} τ_{i7} = ι⊤ M^a M^b M^c τ

F = {a(·,·), b(·,·), c(·,·)},
G = [Figure: the cyclic graph a - b - c, the chain closed into a cycle.]

SLIDE 21

GWM: Examples

F = {ι(·), τ(·), a(·,·), b(·,·), c(·,·)}, GWM {ι, M^a, M^b, M^c, τ}

G = [Figure: the string graph ι - a - b - c - τ, consecutive ports connected.]

1. ι_{i1} M^a_{i2 i3} M^b_{i4 i5} M^c_{i6 i7} τ_{i8}

2. r(G) = Σ_{i1 i3 i5 i7} ι_{i1} M^a_{i1 i3} M^b_{i3 i5} M^c_{i5 i7} τ_{i7} = ι⊤ M^a M^b M^c τ

F = {a(·,·), b(·,·), c(·,·)},
G = [Figure: the cyclic graph a - b - c, the chain closed into a cycle.]

1. M^a_{i1 i2} M^b_{i3 i4} M^c_{i5 i6}

2. r(G) = Σ_{i2 i4 i6} M^a_{i6 i2} M^b_{i2 i4} M^c_{i4 i6} = Tr(M^a M^b M^c)
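The cyclic example above, checked numerically: contracting a cycle of matrices is exactly a trace. Random placeholder matrices, d = 3 arbitrary:

```python
import numpy as np

# Circular string a-b-c: r(G) = Tr(M^a M^b M^c).
rng = np.random.default_rng(0)
d = 3
Ma, Mb, Mc = (rng.normal(size=(d, d)) for _ in range(3))

# Σ_{i2 i4 i6} Ma[i6,i2] Mb[i2,i4] Mc[i4,i6]
r = np.einsum('fi,ij,jf->', Ma, Mb, Mc)
print(np.isclose(r, np.trace(Ma @ Mb @ Mc)))  # True
```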

SLIDE 22

Recognizable graph series

A series r : G_F → R is recognizable iff it can be computed by a GWM.

SLIDE 23

Recognizable graph series

A series r : G_F → R is recognizable iff it can be computed by a GWM.

Beyond strings: circular strings, 2D words/pictures...

[Figure: circular strings over {a, b}, and a 2×2 picture over {a, b, c, d} built from 4-port vertices with boundary vertices labeled α.]

SLIDE 24

Recognizable graph series

A series r : G_F → R is recognizable iff it can be computed by a GWM.

Crosswords: let r_h and r_v be two recognizable string series on Σ∗.

[Figure: a 2×2 picture with cells a, c, d, b (4-port vertices) and boundary vertices labeled α.]

→ r_h(ac) r_h(db) r_v(ad) r_v(cb)

SLIDE 25

Outline

1. Objective and Method
2. Graph Weighted Model
3. Main Results
4. Towards Learning GWMs
5. Conclusion

SLIDE 26

Main results (1)

Proposition
GWMs are a direct generalization of linear representations of string/tree series.

Proposition
- The sum of two recognizable series is recognizable.
- The Hadamard product of two recognizable series is recognizable.

A main question: are series with finite support recognizable?

SLIDE 27

Recognizability of Finite Support Series

Given a graph Ḡ, is there a GWM such that r(G) = 1 if G = Ḡ and r(G) = 0 otherwise?

SLIDE 28

Recognizability of Finite Support Series

Given a graph Ḡ, is there a GWM such that r(G) = 1 if G = Ḡ and r(G) = 0 otherwise?

Simple counter-example:
- Circular strings on F = {a(·,·)}, GWM r with parameters d, {M^a ∈ R^{d×d}}.
- r(G_{a^n}) = Tr((M^a)^n) for all n.
- If Ḡ = G_a, we want Tr(M^a) = 1 and Tr((M^a)^n) = 0 for all n ≥ 2.

SLIDE 29

Recognizability of Finite Support Series

Given a graph Ḡ, is there a GWM such that r(G) = 1 if G = Ḡ and r(G) = 0 otherwise?

Simple counter-example:
- Circular strings on F = {a(·,·)}, GWM r with parameters d, {M^a ∈ R^{d×d}}.
- r(G_{a^n}) = Tr((M^a)^n) for all n.
- If Ḡ = G_a, we want Tr(M^a) = 1 and Tr((M^a)^n) = 0 for all n ≥ 2.

Lemma
Let M ∈ R^{d×d}. If Tr(M^n) = 0 for all n ≥ 2, then Tr(M) = 0.
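The lemma rules out the desired GWM: no matrix can have Tr(M) = 1 while all higher-power traces vanish. A small numeric illustration on two hand-picked matrices (a sanity check, not a proof):

```python
import numpy as np

def traces(M, n_max=8):
    """Return [Tr(M^1), ..., Tr(M^n_max)]."""
    P = np.eye(M.shape[0])
    out = []
    for _ in range(n_max):
        P = P @ M
        out.append(np.trace(P))
    return out

# A nilpotent matrix: traces of ALL powers vanish, including n = 1,
# consistent with the lemma.
N = np.array([[0.0, 1.0], [0.0, 0.0]])
print(traces(N))   # all zeros

# Trying to force Tr(M) = 1: here Tr(M^n) = 1 for every n, so the
# requirement Tr(M^n) = 0 for n >= 2 already fails at n = 2.
M = np.diag([1.0, 0.0])
print(traces(M))   # all ones
```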

SLIDE 30

Tilings

[Figure: the graph Ḡ with vertices labeled f, f, g, a and numbered ports.]

Figure: A graph Ḡ.

SLIDE 31

Tilings

[Figure: three disjoint copies of the graph Ḡ.]

Figure: A graph with 3 connected components, each isomorphic to Ḡ.

SLIDE 32

Tilings

[Figure: three copies of the graph Ḡ connected together.]

Figure: A tiling made of three copies of the graph Ḡ.

SLIDE 33

Tilings

[Figure: three copies of the graph Ḡ connected together.]

Figure: A tiling made of three copies of the graph Ḡ.

For any GWM r and any graph Ḡ, if r(Ḡ) ≠ 0, then there exists a tiling G of Ḡ such that r(G) ≠ 0.

SLIDE 34

Main results (2)

Theorem
Given a graph Ḡ, there exists a recognizable series r such that r(G) ≠ 0 if and only if G is a tiling of Ḡ.

Corollary
For any family of graphs which does not allow tilings, graph series with finite support are recognizable.

Example: family of rooted graphs over F: there exists a0 ∈ Σ such that for any graph G in the family, there is exactly one vertex v ∈ V_G with ℓ(v) = a0.

SLIDE 35

Outline

1. Objective and Method
2. Graph Weighted Model
3. Main Results
4. Towards Learning GWMs
5. Conclusion

SLIDE 36

Ongoing Work: Learning GWMs

Let r be a GWM with parameters d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Given (G1, r(G1)), (G2, r(G2)), ..., can we recover the tensors {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}?

SLIDE 37

Ongoing Work: Learning GWMs

Let r be a GWM with parameters d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Given (G1, r(G1)), (G2, r(G2)), ..., can we recover the tensors {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}?

Spectral learning for recognizable series on strings:
- Low-rank factorization of the Hankel matrix H ∈ R^{Σ∗×Σ∗}, H_{u,v} = r(uv).

SLIDE 38

Ongoing Work: Learning GWMs

Let r be a GWM with parameters d, {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}. Given (G1, r(G1)), (G2, r(G2)), ..., can we recover the tensors {T^x ∈ (R^d)^{⊗♯x}}_{x∈F}?

Spectral learning for recognizable series on strings:
- Low-rank factorization of the Hankel matrix H ∈ R^{Σ∗×Σ∗}, H_{u,v} = r(uv).

Learning GWMs:
- Graph cuts: Hankel matrices/tensors in R^{G_{F,2} × G_{F,2}}, R^{F_1 × G_{F,1}}, R^{F_2 × F_1 × G_{F,3}}, ...

→ Preliminary results show that low-rank factorizations of the Hankel tensors can be used to recover the GWM parameters (circular strings and 2D words).
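The key property behind the string case above: a finite block of the Hankel matrix H_{u,v} = r(uv) has rank at most the dimension d of the linear representation, since H factors through the d-dimensional state. A sketch with a hypothetical 2-dimensional automaton (values are illustrative):

```python
import numpy as np
from itertools import product

# A made-up 2-dimensional weighted automaton over Σ = {a, b}.
d = 2
iota = np.array([1.0, 0.0])
tau = np.array([0.0, 1.0])
M = {"a": np.array([[0.5, 0.2], [0.0, 0.3]]),
     "b": np.array([[0.1, 0.4], [0.2, 0.0]])}

def r(w):
    v = iota
    for s in w:
        v = v @ M[s]
    return v @ tau

# Finite sub-block of the Hankel matrix, indexed by words up to length 2.
words = [""] + ["".join(p) for n in (1, 2) for p in product("ab", repeat=n)]
H = np.array([[r(u + v) for v in words] for u in words])

# H = P S with P_u = ι⊤ M_u and S_v = M_v τ, so rank(H) <= d.
print(np.linalg.matrix_rank(H, tol=1e-10))  # 2
```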

SLIDE 39

Outline

1. Objective and Method
2. Graph Weighted Model
3. Main Results
4. Towards Learning GWMs
5. Conclusion

SLIDE 40

Conclusion

- We proposed a definition of recognizable series on graphs (and hypergraphs).
- Direct generalization of recognizable series on strings and trees.
- Characterization of the recognizability of finite support series.

SLIDE 41

Conclusion

- We proposed a definition of recognizable series on graphs (and hypergraphs).
- Direct generalization of recognizable series on strings and trees.
- Characterization of the recognizability of finite support series.
- Generalization of the spectral method to hypergraph recognizable series?
- A bridge between graphical models and recognizable series?
- Algorithms to compute/approximate/learn (e.g. message passing).

SLIDE 42

Conclusion

- We proposed a definition of recognizable series on graphs (and hypergraphs).
- Direct generalization of recognizable series on strings and trees.
- Characterization of the recognizability of finite support series.
- Generalization of the spectral method to hypergraph recognizable series?
- A bridge between graphical models and recognizable series?
- Algorithms to compute/approximate/learn (e.g. message passing).

Thank you for your attention.

SLIDE 43

Hypergraphs

[Figure: a hypergraph with vertices 1 (label a, 3 ports), 2 (label b, 2 ports), 3 (label a, 3 ports) and hyperedges h1, h2, h3, h4.]

F = {(a, 3), (b, 2)}, V = {1, 2, 3}, ℓ(1) = ℓ(3) = a, ℓ(2) = b.

A hypergraph G = (V, E, ℓ) on a ranked alphabet F = (Σ, ♯):
- V: set of vertices,
- ℓ : V → Σ: labeling function,
- P = {(v, j) : v ∈ V, 1 ≤ j ≤ ♯ℓ(v)}: set of ports of G,
- E = (h_k)_{1≤k≤n_E}, a partition of P: set of hyperedges of G.

SLIDE 44

Hypergraphs

[Figure: the same hypergraph with hyperedges h1, h2, h3, h4.]

P = {(1,1), (1,2), (1,3), (2,1), (2,2), (3,1), (3,2), (3,3)}

A hypergraph G = (V, E, ℓ) on a ranked alphabet F = (Σ, ♯):
- V: set of vertices,
- ℓ : V → Σ: labeling function,
- P = {(v, j) : v ∈ V, 1 ≤ j ≤ ♯ℓ(v)}: set of ports of G,
- E = (h_k)_{1≤k≤n_E}, a partition of P: set of hyperedges of G.

SLIDE 45

Hypergraphs

[Figure: the same hypergraph with hyperedges h1, h2, h3, h4.]

E = { h1: {(1,1), (3,3)}, h2: {(1,2), (2,1), (3,2)}, h3: {(1,3), (2,2)}, h4: {(3,1)} }

A hypergraph G = (V, E, ℓ) on a ranked alphabet F = (Σ, ♯):
- V: set of vertices,
- ℓ : V → Σ: labeling function,
- P = {(v, j) : v ∈ V, 1 ≤ j ≤ ♯ℓ(v)}: set of ports of G,
- E = (h_k)_{1≤k≤n_E}, a partition of P: set of hyperedges of G.

SLIDE 46

Hypergraph Weighted Model

An HWM is a tuple (F, d, {T^x}_{x∈Σ}, ⊙, β) where:
- F is a ranked alphabet,
- d ∈ N+ is the dimension of the representation, V = R^d,
- T^x ∈ V^{⊗♯x} is the tensor associated with symbol x,
- ⊙ : V × V → V is a symmetric associative product,
- β is a linear form on V.

Example of reduction operators: ⊙id is defined by e_i ⊙id e_j = δ_ij e_i; β1 is defined by β1(e_i) = 1 for 1 ≤ i ≤ d.
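The example reduction operators above, written out in coordinates (a sketch; the function names are illustrative): ⊙id acts as the componentwise (Hadamard) product on canonical basis vectors, and β1 sums the coordinates.

```python
import numpy as np

def odot_id(u, v):
    """⊙id: e_i ⊙ e_j = δ_ij e_i, i.e. the componentwise product."""
    return u * v

def beta_1(v):
    """β1: the linear form sending every basis vector e_i to 1."""
    return v.sum()

e = np.eye(3)
print(odot_id(e[0], e[0]))   # e_0
print(odot_id(e[0], e[1]))   # zero vector
print(beta_1(e[2]))          # 1.0
```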

SLIDE 47

HWM: Computation

[Figure: the hypergraph with vertices 1 (a), 2 (b), 3 (a) and hyperedges h1, h2, h3, h4.]

HWM (F, d, {A, B}, ⊙, β)

1. Tensor product of all tensors associated to vertices:

    Σ_{i1···i8} A_{i1 i2 i3} A_{i4 i5 i6} B_{i7 i8} e_{i1} ⊗ ··· ⊗ e_{i8}

2. Reduction with ⊙ directed by the hyperedges:

    Σ_{i1···i8} A_{i1 i2 i3} A_{i4 i5 i6} B_{i7 i8} (e_{i1} ⊙ e_{i6}) ⊗ (e_{i2} ⊙ e_{i5} ⊙ e_{i7}) ⊗ (e_{i3} ⊙ e_{i8}) ⊗ e_{i4}

3. Contraction with β:

    Σ_{i1···i8} A_{i1 i2 i3} A_{i4 i5 i6} B_{i7 i8} β(e_{i1} ⊙ e_{i6}) β(e_{i2} ⊙ e_{i5} ⊙ e_{i7}) β(e_{i3} ⊙ e_{i8}) β(e_{i4})
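The three-step HWM computation above can be sketched as one einsum when the reduction operators are taken to be ⊙id and β1 (an assumption: the slide keeps ⊙ and β generic). With these operators, reducing a hyperedge forces all of its ports to carry the same index and sums over it, and the singleton hyperedge h4 is simply summed out. Random placeholder tensors, d = 3 arbitrary:

```python
import numpy as np

# Indices: vertex 1 (a): i1 i2 i3; vertex 3 (a): i4 i5 i6; vertex 2 (b): i7 i8.
# Hyperedges: h1 = {i1, i6}, h2 = {i2, i5, i7}, h3 = {i3, i8}, h4 = {i4}.
rng = np.random.default_rng(0)
d = 3
A1 = rng.normal(size=(d, d, d))   # tensor of vertex 1 (label a)
A2 = rng.normal(size=(d, d, d))   # tensor of vertex 3 (label a)
B = rng.normal(size=(d, d))       # tensor of vertex 2 (label b)

# h1 identifies i1 with i6, h2 identifies i2, i5, i7, h3 identifies i3
# with i8, and h4 leaves i4 free (summed out by β1).
r = np.einsum('ijk,mji,jk->', A1, A2, B)

# Same value from the definition, as a cross-check.
r_naive = sum(A1[i, j, k] * A2[m, j, i] * B[j, k]
              for i in range(d) for j in range(d)
              for k in range(d) for m in range(d))
assert np.isclose(r, r_naive)
print(r)
```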