SLIDE 1

Outline: Introduction · RNNs · Muller TMs · Topology · Det. ω-NNs · Nondet. ω-RNNs · Conclusion

Expressive Power of Evolving Neural Networks Working on Infinite Input Streams

Olivier Finkel¹, joint work with Jérémie Cabessa²

¹ CNRS and University Paris 7
² Laboratoire d'Économie Mathématique, Université Paris 2

FCT 2017, Bordeaux, 12 September 2017

SLIDE 2

Introduction

◮ The computational capabilities of recurrent neural networks have mainly been studied in the context of classical computation: McCulloch & Pitts (1943), Turing (1948), Kleene (1956), von Neumann (1958), Minsky (1967), Papert (1969), ..., Siegelmann & Sontag (1994-1995), ...

◮ We provide a characterization of the computational power of recurrent neural networks in terms of their attractor dynamics, i.e., in the context of infinite input stream computation.


SLIDE 4

Recurrent Neural Network

[Figure: a recurrent neural network; cells connected by weighted edges w1, ..., w16.]

SLIDE 5

Boolean Neural Networks

[Figure: a neuron x_i with incoming weights a_i1, ..., a_iN from the other cells, b_i1, ..., b_iM from the inputs, bias c_i, and hard-threshold activation θ.]

x_i(t + 1) = θ( Σ_{j=1}^{N} a_ij · x_j(t) + Σ_{j=1}^{M} b_ij · u_j(t) + c_i )
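No code appears in the talk itself; as an illustrative sketch of this update rule (the two-cell "delay line" network and all names below are mine, and the exact threshold convention for θ varies across papers):

```python
def theta(x):
    # hard-threshold activation; the convention (> 0, >= 1, ...) varies by paper
    return 1 if x > 0 else 0

def boolean_step(x, u, a, b, c):
    """One synchronous update: x_i(t+1) = theta(sum_j a_ij x_j(t) + sum_j b_ij u_j(t) + c_i)."""
    n = len(x)
    return [theta(sum(a[i][j] * x[j] for j in range(n))
                  + sum(b[i][j] * u[j] for j in range(len(u)))
                  + c[i])
            for i in range(n)]

# toy 2-cell network: cell 0 copies the input, cell 1 copies cell 0 (a one-step delay line)
a = [[0, 0], [1, 0]]
b = [[1], [0]]
c = [0, 0]
x = [0, 0]
for u in ([1], [0], [1]):
    x = boolean_step(x, u, a, b, c)
print(x)  # -> [1, 0]: cell 0 holds the last input, cell 1 the one before it
```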

SLIDE 6

Sigmoidal Neural Networks

[Figure: a neuron x_i with incoming weights a_i1, ..., a_iN, b_i1, ..., b_iM, bias c_i, and sigmoid activation σ.]

x_i(t + 1) = σ( Σ_{j=1}^{N} a_ij · x_j(t) + Σ_{j=1}^{M} b_ij · u_j(t) + c_i )

SLIDE 7

Sigmoidal Neural Networks

[Figure: the same neuron with time-dependent weights a_i1(t), ..., a_iN(t), b_i1(t), ..., b_iM(t), and bias c_i(t).]

x_i(t + 1) = σ( Σ_{j=1}^{N} a_ij(t) · x_j(t) + Σ_{j=1}^{M} b_ij(t) · u_j(t) + c_i(t) )
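The sigmoidal update can be sketched the same way, here with the saturated-linear sigmoid commonly used in this line of work; it covers the evolving case by letting the weight matrices be functions of t. The single-cell example and its alternating weight schedule are hypothetical, not from the talk:

```python
def sigma(x):
    # saturated-linear sigmoid: clip the weighted sum to [0, 1]
    return 0.0 if x < 0 else (1.0 if x > 1 else x)

def sigmoidal_step(x, u, a, b, c):
    """One synchronous update: x_i(t+1) = sigma(sum_j a_ij x_j(t) + sum_j b_ij u_j(t) + c_i)."""
    n = len(x)
    return [sigma(sum(a[i][j] * x[j] for j in range(n))
                  + sum(b[i][j] * u[j] for j in range(len(u)))
                  + c[i])
            for i in range(n)]

def run_evolving(x0, inputs, a_of_t, b_of_t, c_of_t):
    # evolving network: the weights handed to each step depend on the time t
    x = x0
    for t, u in enumerate(inputs):
        x = sigmoidal_step(x, u, a_of_t(t), b_of_t(t), c_of_t(t))
    return x

# hypothetical bi-valued evolving weight: the input weight alternates between 1/2 and 1/4
x = run_evolving([0.0], [[1.0]] * 4,
                 lambda t: [[0.0]],
                 lambda t: [[0.5]] if t % 2 == 0 else [[0.25]],
                 lambda t: [0.0])
print(x)  # -> [0.25]: the last step used the weight 1/4
```

"Bi-valued evolving" means exactly this shape: each weight is a sequence over two rational values.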

SLIDE 8

Neural Networks

We consider three models of NNs:

1. Boolean rational NNs: B-NN[Q]s
2. Sigmoidal static rational NNs: St-NN[Q]s
3. Sigmoidal bi-valued evolving rational NNs: Ev2-NN[Q]s


SLIDE 12

Results (Classical Computation)

                   Boolean                 Static                    Bi-valued Evolving
Machine model      FSA                     TM                        TM/poly(A)
Language class     REG                     P                         P/poly
Reference          Kleene 56, Minsky 67    Siegelmann & Sontag 95    Cabessa & Siegelmann 11, 14

SLIDE 13

Muller Turing Machine

A Muller Turing machine M consists of a classical TM equipped with a Muller acceptance condition, given by a Muller table T: a collection of accepting sets of states.

◮ The ω-word u is accepted by M if there is an infinite run ρ_u of the machine M on u such that inf(ρ_u) ∈ T.

◮ The ω-language accepted by M is the set of ω-words accepted by M.
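For a run that is eventually periodic (a lasso ρ = prefix·cycle^ω), inf(ρ) is exactly the set of states occurring on the cycle, so the Muller condition can be checked directly. A small sketch; the state names and the table are hypothetical:

```python
def inf_states(prefix, cycle):
    """States visited infinitely often along prefix . cycle^omega.
    The prefix is visited only finitely often, so only the cycle matters."""
    return frozenset(cycle)

def muller_accepts(prefix, cycle, table):
    # Muller condition: the inf-set of the run must be one of the sets in T
    return inf_states(prefix, cycle) in {frozenset(s) for s in table}

# hypothetical table: accept iff the run settles into exactly {q1, q2}
T = [{"q1", "q2"}]
print(muller_accepts(["q0"], ["q1", "q2"], T))  # -> True
print(muller_accepts(["q0"], ["q1"], T))        # -> False: inf-set {q1} is not in T
```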

SLIDE 14

Complexity of ω-languages

The question naturally arises of the complexity of the ω-languages accepted by various kinds of automata. One way to study the complexity of ω-languages is to consider their topological complexity.

SLIDE 15

Topology on Σ^ω

The natural prefix metric on the set Σ^ω of ω-words over Σ is defined as follows: for u, v ∈ Σ^ω with u ≠ v, let δ(u, v) = 2^(−n), where n is the least integer such that the (n+1)st letter of u differs from the (n+1)st letter of v. This metric induces on Σ^ω the usual Cantor topology, for which:

◮ open subsets of Σ^ω are those of the form W·Σ^ω, where W ⊆ Σ*;
◮ closed subsets of Σ^ω are the complements of open subsets of Σ^ω.
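A finite-horizon sketch of this metric (the helper and the example words are mine; an ω-word cannot be compared in full, so a horizon parameter truncates the search):

```python
def prefix_distance(u, v, horizon=64):
    """delta(u, v) = 2**-n, where n is the first (0-indexed) position at which
    u and v differ, i.e. the (n+1)st letter. u and v are callables giving the
    i-th letter; words agreeing up to the horizon get distance 0.0, which is
    only a finite approximation of the true metric."""
    for n in range(horizon):
        if u(n) != v(n):
            return 2.0 ** -n
    return 0.0

u = lambda i: "ab"[i % 2]            # the word (ab)^omega = abababab...
v = lambda i: "a" if i < 3 else "b"  # the word aaab^omega = aaabbbbb...
print(prefix_distance(u, v))  # -> 0.5: first difference at position 1, so 2**-1
```

Balls in this metric are exactly the cylinder sets w·Σ^ω, which is why the induced topology is the Cantor topology described above.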

SLIDE 16

Borel Hierarchy

Σ^0_1 is the class of open subsets of Σ^ω.
Π^0_1 is the class of closed subsets of Σ^ω.

For any countable ordinal α ≥ 2:
  Σ^0_α is the class of countable unions of subsets of Σ^ω in ⋃_{γ<α} Π^0_γ;
  Π^0_α is the class of complements of Σ^0_α-sets;
  Δ^0_α = Π^0_α ∩ Σ^0_α.

A set X ⊆ Σ^ω is a Borel set iff it is in ⋃_{α<ω₁} Σ^0_α = ⋃_{α<ω₁} Π^0_α, where ω₁ is the first uncountable ordinal.

SLIDE 17

Borel Hierarchy

Below, an arrow represents a strict inclusion between Borel classes:

          Π^0_1                    Π^0_α                    Π^0_{α+1}
        ↗       ↘                ↗       ↘                ↗
Δ^0_1             Δ^0_2  · · ·  Δ^0_α              Δ^0_{α+1}  · · ·
        ↘       ↗                ↘       ↗                ↘
          Σ^0_1                    Σ^0_α                    Σ^0_{α+1}

SLIDE 18

Beyond the Borel Hierarchy

There are subsets of Σ^ω which are not Borel. Beyond the Borel hierarchy lies the projective hierarchy. The class of Borel subsets of Σ^ω is strictly included in the class Σ^1_1 of analytic sets, which are obtained by projection of Borel sets.

A set E ⊆ Σ^ω is in the class Σ^1_1 iff there exists F ⊆ (Σ × {0, 1})^ω such that F is Π^0_2 and E is the projection of F onto Σ^ω.

A set E ⊆ Σ^ω is in the class Π^1_1 iff Σ^ω − E is in Σ^1_1.

Suslin's Theorem states that: Borel sets = Δ^1_1 = Σ^1_1 ∩ Π^1_1.

SLIDE 19

Deterministic ω-NNs

We consider RNNs with Boolean input and output cells, sigmoidal internal cells, working on infinite input streams.

[Figure: an infinite Boolean input stream feeds the Boolean input cells; sigmoidal internal cells compute; the Boolean output cells emit an infinite Boolean output stream, which eventually enters a periodic attractor.]

The attractors are assumed to be classified into two possible kinds: accepting or rejecting.

◮ An infinite Boolean input stream is accepted by N if the corresponding Boolean output stream visits an accepting attractor.

◮ An infinite Boolean input stream is rejected by N if the corresponding Boolean output stream visits a rejecting attractor.

◮ The set of all input streams accepted by N is the ω-language recognized by N.
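For the purely Boolean case this attractor mechanism can be made concrete: with finitely many Boolean states and a periodic input stream, the trajectory must eventually cycle, and that cycle is the attractor. A small sketch (the function and the one-cell example are mine, not from the talk):

```python
def attractor(step, x0, inputs_period):
    """Iterate a Boolean network under a periodic input stream and return the
    set of states on the attractor (the cycle eventually entered). A finite
    Boolean state space plus periodic input guarantees the trajectory cycles."""
    seen = {}            # (state, input phase) -> time of first visit
    x, t = tuple(x0), 0
    trace = []
    while (x, t % len(inputs_period)) not in seen:
        seen[(x, t % len(inputs_period))] = t
        trace.append(x)
        x = tuple(step(list(x), inputs_period[t % len(inputs_period)]))
        t += 1
    start = seen[(x, t % len(inputs_period))]
    return set(trace[start:])   # states visited infinitely often

# toy step function: a single cell that negates itself each tick (input ignored)
step = lambda x, u: [1 - x[0]]
print(attractor(step, [0], [[0]]))  # -> {(0,), (1,)}: the 2-cycle 0,1,0,1,...
```

Classifying such cycles as accepting or rejecting is exactly the attractor-based acceptance condition described above, in the spirit of a Muller condition on the output cells.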

SLIDE 26

Deterministic ω-NNs

We consider two models of deterministic NNs:

1. static rational NNs: D-St-NN[Q]s
2. bi-valued evolving rational NNs: D-Ev2-NN[Q]s


SLIDE 30

Results

Relationship between the models:

Super-Turing:     D-Ev-RNN[R]s, D-Ev2-RNN[R]s, D-St-RNN[R]s, D-Ev2-RNN[Q]s, D-Ev-RNN[Q]s  = BC(Π^0_2)-sets
Turing (Muller):  D-St-RNN[Q]s  = BC(Π^0_2)-sets

SLIDE 31

Results

Theorem (Staiger (1997), Cabessa & Villa (2016))

Let L ⊆ (B^M)^ω. The following conditions are equivalent:
◮ L ∈ BC(Π^0_2);
◮ L is recognizable by some deterministic Muller TM;
◮ L is recognizable by some D-St-NN[Q].

SLIDE 32

Results

Theorem (Cabessa & Villa (2016))

Let L ⊆ (B^M)^ω. The following conditions are equivalent:
◮ L ∈ BC(Π^0_2);
◮ L is recognizable by some D-Ev2-NN[Q].

SLIDE 33

Results – Summary

DET.      Static             Bi-valued Evolving
Model     D-St-NN[Q]s        D-Ev2-NN[Q]s
Class     = BC(Π^0_2)        = BC(Π^0_2)
Power     Turing (Muller)    super-Turing

SLIDE 34

Deterministic ω-NNs

◮ We consider bi-valued evolving rational NNs with only one evolving weight: D-Ev2-NN[Q, α]s, where α ∈ {0, 1}^ω.

[Figure: a neuron x_i whose single weight a_i2(t) evolves as the binary sequence α ∈ {0, 1}^ω, while all other weights a_i1, ..., a_iN, b_i1, ..., b_iM, c_i stay static.]

SLIDE 35

Relativized Classes

If L ⊆ X^ω and Γ is a topological class, then for α ∈ {0, 1}^ω, L ∈ Γ(α) iff there exists some L′ ⊆ (X × {0, 1})^ω such that L′ ∈ Γ and, for all x ∈ X^ω, x ∈ L ⇔ (x, α) ∈ L′.

In other words, L ∈ Γ(α) iff L is the section of some L′ ∈ Γ at α.

SLIDE 36

Results

Theorem

Let L ⊆ (B^M)^ω. The following conditions are equivalent:
◮ L ∈ BC(Π^0_2)(α), for some α ∈ {0, 1}^ω;
◮ L is recognizable by some D-Ev2-NN[Q, α].

[Figure: the class BC(Π^0_2)(α) extending BC(Π^0_2).]

SLIDE 37

Results

Proposition

There exists a sequence (α_k)_{k<ω₁}, with each α_i ∈ {0, 1}^ω, such that BC(Π^0_2)(α_i) ⊊ BC(Π^0_2)(α_j) for all i < j < ω₁.

[Figure: a strictly increasing chain of classes BC(Π^0_2) ⊆ BC(Π^0_2)(α₀) ⊊ BC(Π^0_2)(α₁) ⊊ BC(Π^0_2)(α₂) ⊊ · · ·]

SLIDE 38

Nondeterministic ω-NNs

The NNs are provided with an additional Boolean guess cell.

[Figure: as before, Boolean input cells, sigmoid internal cells, and Boolean output cells whose infinite output stream eventually enters a periodic attractor; in addition, a guess cell is fed by an infinite Boolean guess stream.]

The attractors are assumed to be classified into two possible kinds: accepting or rejecting.

◮ An input stream s ∈ (B^M)^ω is accepted by N iff there exists some guess g ∈ B^ω such that N(s, g) enters an accepting attractor.

◮ An input stream s ∈ (B^M)^ω is rejected by N iff for every guess g ∈ B^ω, N(s, g) does not enter an accepting attractor.

SLIDE 44

Nondeterministic ω-NNs

We consider two models of nondeterministic NNs:

1. static rational NNs: N-St-NN[Q]s
2. bi-valued evolving rational NNs: N-Ev2-NN[Q]s


SLIDE 48

Results

Relationship between the models:

Super-Turing:     N-Ev-RNN[R]s, N-Ev2-RNN[R]s, N-St-RNN[R]s, N-Ev2-RNN[Q]s, N-Ev-RNN[Q]s  = Σ^1_1-sets
Turing (Muller):  N-St-RNN[Q]s  = Σ^1_1-sets

SLIDE 49

Results

Theorem (Staiger (1997), Cabessa & Villa (2016))

Let L ⊆ (B^M)^ω. The following conditions are equivalent:
◮ L ∈ Σ^1_1;
◮ L is recognizable by some nondeterministic Muller TM;
◮ L is recognizable by some N-St-NN[Q].

SLIDE 50

Results

Theorem (Cabessa & Villa (2016))

Let L ⊆ (B^M)^ω. The following conditions are equivalent:
◮ L ∈ Σ^1_1;
◮ L is recognizable by some N-Ev2-NN[Q].

SLIDE 51

Results – Summary

NONDET.   Static             Bi-valued Evolving
Model     N-St-NN[Q]s        N-Ev2-NN[Q]s
Class     = Σ^1_1            = Σ^1_1
Power     Turing (Muller)    super-Turing

SLIDE 52

Nondeterministic ω-RNNs

  • 1. We consider bi-valued evolving rational NNs with only one

evolving weight: N-Ev2-NN[Q, α]s, where α ∈ {0, 1}ω

ci ai1 ai2(t) ∈ {0, 1}ω aiN biM bi1

1 1

σ

xi neuron

SLIDE 53

Results

Theorem

Let L ⊆ (B^M)^ω. The following conditions are equivalent:
◮ L ∈ Σ^1_1(α), for some α ∈ {0, 1}^ω;
◮ L is recognizable by some N-Ev2-NN[Q, α].

[Figure: the class Σ^1_1(α) extending Σ^1_1.]

SLIDE 54

Results

Proposition

There exists a sequence (α_k)_{k<ω₁}, with each α_i ∈ {0, 1}^ω, such that Σ^1_1(α_i) ⊊ Σ^1_1(α_j) for all i < j < ω₁.

[Figure: a strictly increasing chain of classes Σ^1_1 ⊆ Σ^1_1(α₀) ⊊ Σ^1_1(α₁) ⊊ Σ^1_1(α₂) ⊊ · · ·]

SLIDE 55

Conclusion

◮ We provided a precise characterization of the expressive power of evolving neural networks employing only one evolving weight.

◮ As a consequence, a proper hierarchy of classes of evolving neural nets, based on the complexity of their underlying evolving weights, can be obtained.

◮ The hierarchy contains chains of length ω₁ as well as uncountable antichains.

◮ The super-Turing computational capabilities of neural models are related to the issue of hypercomputation.

◮ Current physical theories are consistent with the possibility of hypercomputational systems (quantum, relativistic, etc.), but no such systems are currently feasible or harnessable.
