

  1. Expressive Power of Evolving Neural Networks Working on Infinite Input Streams
Outline: Introduction · RNNs · Muller TMs · Topology · Det. ω-NNs · Nondet. ω-RNNs · Conclusion
Olivier Finkel^1, joint work with Jérémie Cabessa^2
^1 CNRS and University Paris 7
^2 Laboratoire d'Économie Mathématique, Université Paris 2
FCT 2017, Bordeaux, 12 September 2017

  2. Introduction
◮ The computational capabilities of recurrent neural networks have mainly been studied in the context of classical computation: McCulloch & Pitts (1943), Turing (1948), Kleene (1956), von Neumann (1958), Minsky (1967), Papert (1969), …, Siegelmann & Sontag (1994-1995), …
◮ We provide a characterization of the computational power of recurrent neural networks in terms of their attractor dynamics, i.e., in the context of infinite input stream computation.

  4. Recurrent Neural Network
[Figure: a recurrent neural network whose connections carry weights w1, …, w16.]

  5. Boolean Neural Networks
[Figure: neuron x_i with recurrent weights a_i1, …, a_iN, input weights b_i1, …, b_iM, bias c_i, and hard-threshold activation θ.]
x_i(t+1) = θ( Σ_{j=1}^{N} a_ij · x_j(t) + Σ_{j=1}^{M} b_ij · u_j(t) + c_i )
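As a minimal sketch (not from the talk: the function names, the toy weights, and the threshold convention θ(x) = 1 iff x > 0 are all illustrative assumptions), one synchronous update step of a Boolean rational network could look like:

```python
# Illustrative sketch of one Boolean-network update step.  The threshold
# convention theta(x) = 1 iff x > 0 is an assumption, not from the talk.
from fractions import Fraction

def theta(x):
    """Hard-threshold activation."""
    return 1 if x > 0 else 0

def step(a, b, c, x, u):
    """x_i(t+1) = theta(sum_j a_ij*x_j(t) + sum_j b_ij*u_j(t) + c_i)."""
    return [theta(sum(a[i][j] * x[j] for j in range(len(x)))
                  + sum(b[i][j] * u[j] for j in range(len(u)))
                  + c[i])
            for i in range(len(x))]

# Toy network with rational weights: neuron 0 copies the input bit,
# neuron 1 copies neuron 0 (a one-step delay line).
a = [[Fraction(0), Fraction(0)], [Fraction(1), Fraction(0)]]
b = [[Fraction(1)], [Fraction(0)]]
c = [Fraction(0), Fraction(0)]
x = [0, 0]
for bit in [1, 0, 1]:
    x = step(a, b, c, x, [bit])
# After inputs 1, 0, 1 the state is [1, 0]: the last bit and the bit before it.
```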

  6. Sigmoidal Neural Networks
[Figure: the same neuron x_i, now with sigmoid activation σ.]
x_i(t+1) = σ( Σ_{j=1}^{N} a_ij · x_j(t) + Σ_{j=1}^{M} b_ij · u_j(t) + c_i )

  7. Sigmoidal Neural Networks
[Figure: neuron x_i with time-dependent weights a_i1(t), …, a_iN(t), input weights b_i1(t), …, b_iM(t), and bias c_i(t).]
x_i(t+1) = σ( Σ_{j=1}^{N} a_ij(t) · x_j(t) + Σ_{j=1}^{M} b_ij(t) · u_j(t) + c_i(t) )
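A sketch of the evolving update, where weights are now functions of t. The saturated-linear sigmoid is a common choice in this literature and is assumed here, not taken from the slides; the concrete weight schedules are invented. A "bi-valued" evolving weight alternates between just two rational values over time:

```python
# Illustrative sketch of an evolving update: weights are functions of t.
# The saturated-linear sigmoid is an assumption; the schedules are invented.

def sigma(x):
    """Saturated-linear sigmoid: clamp the weighted sum to [0, 1]."""
    return 0.0 if x < 0 else (x if x <= 1 else 1.0)

def step_t(a, b, c, x, u, t):
    """x_i(t+1) = sigma(sum_j a_ij(t)*x_j(t) + sum_j b_ij(t)*u_j(t) + c_i(t))."""
    return [sigma(sum(a(i, j, t) * x[j] for j in range(len(x)))
                  + sum(b(i, j, t) * u[j] for j in range(len(u)))
                  + c(i, t))
            for i in range(len(x))]

# A bi-valued evolving weight: a_00(t) alternates between 0 and 1/2.
a = lambda i, j, t: 0.5 if t % 2 == 1 else 0.0
b = lambda i, j, t: 1.0
c = lambda i, t: 0.0
x = [0.0]
for t, bit in enumerate([1, 0, 1]):
    x = step_t(a, b, c, x, [bit], t)
```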

  8. Neural Networks
We consider three models of NNs:
1. Boolean rational NNs: B-NN[Q]s
2. Sigmoidal static rational NNs: St-NN[Q]s
3. Sigmoidal bi-valued evolving rational NNs: Ev2-NN[Q]s

  12. Results (Classical Computation)
Boolean: FSA / REG (Kleene 56, Minsky 67)
Static: TM / P (Siegelmann & Sontag 95)
Bi-valued evolving: TM/poly(A) / P/poly (Cabessa & Siegelmann 11, 14)

  13. Muller Turing Machine
A Muller Turing machine consists of a classical TM with a Muller acceptance condition.
Muller table T: a collection of accepting sets of states.
◮ The ω-word u is accepted by M if there is an infinite run ρ_u of the machine M on u such that inf(ρ_u) ∈ T.
◮ The ω-language accepted by M is the set of ω-words accepted by M.
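For an eventually periodic run ρ = prefix · cycle^ω, the Muller condition is easy to check, since inf(ρ) is exactly the set of states occurring on the cycle. A toy sketch (the states and the table are invented for illustration):

```python
# Toy Muller acceptance check for an eventually periodic run
# rho = prefix . cycle^omega, where inf(rho) = set of states on the cycle.

def accepts(prefix, cycle, table):
    """Accept iff inf(rho) is one of the accepting sets in the Muller table."""
    return frozenset(cycle) in table

T = {frozenset({'q1', 'q2'})}
accepts(['q0'], ['q1', 'q2'], T)  # inf = {q1, q2} is in T, so accepted
accepts(['q0'], ['q1'], T)        # inf = {q1} is not in T, so rejected
```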

  14. Complexity of ω-languages
The question naturally arises of the complexity of the ω-languages accepted by various kinds of automata. A way to study the complexity of ω-languages is to consider their topological complexity.

  15. Topology on Σ^ω
The natural prefix metric on the set Σ^ω of ω-words over Σ is defined as follows: for u, v ∈ Σ^ω with u ≠ v, let δ(u, v) = 2^{-n}, where n is the least integer such that the (n+1)st letter of u differs from the (n+1)st letter of v.
This metric induces on Σ^ω the usual Cantor topology, for which:
◮ the open subsets of Σ^ω are the sets of the form W · Σ^ω, where W ⊆ Σ^*;
◮ the closed subsets of Σ^ω are the complements of the open subsets of Σ^ω.
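The prefix metric can be evaluated on finite prefixes of two ω-words that are known to differ within those prefixes; a small sketch (the function name is ours):

```python
# Sketch of the prefix metric, evaluated on finite prefixes of omega-words:
# delta(u, v) = 2^(-n), where n is the least position at which u and v differ
# (equivalently, the length of their longest common prefix).

def delta(u, v):
    if u == v:
        return 0.0
    n = 0
    while n < min(len(u), len(v)) and u[n] == v[n]:
        n += 1
    return 2.0 ** (-n)

delta("abab", "abba")  # longest common prefix "ab", so 2^-2 = 0.25
delta("b", "abc")      # words differ at position 0, so 2^0 = 1.0
```

Words whose distance is small agree on a long prefix, which is exactly why the induced topology is the Cantor topology.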

  16. Borel Hierarchy
Σ^0_1 is the class of open subsets of Σ^ω; Π^0_1 is the class of closed subsets of Σ^ω.
For any countable ordinal α ≥ 2:
◮ Σ^0_α is the class of countable unions of subsets of Σ^ω in ∪_{γ<α} Π^0_γ;
◮ Π^0_α is the class of complements of Σ^0_α-sets;
◮ Δ^0_α = Σ^0_α ∩ Π^0_α.
A set X ⊆ Σ^ω is a Borel set iff it is in ∪_{α<ω_1} Σ^0_α = ∪_{α<ω_1} Π^0_α, where ω_1 is the first uncountable ordinal.

  17. Borel Hierarchy
Below, each arrow represents a strict inclusion between Borel classes:

          Σ^0_1           Σ^0_α           Σ^0_{α+1}
         ↗     ↘         ↗     ↘         ↗
  Δ^0_1         Δ^0_2 ⋯ Δ^0_α           Δ^0_{α+1} ⋯
         ↘     ↗         ↘     ↗         ↘
          Π^0_1           Π^0_α           Π^0_{α+1}

  18. Beyond the Borel Hierarchy
There are subsets of Σ^ω which are not Borel. Beyond the Borel hierarchy lies the projective hierarchy. The class of Borel subsets of Σ^ω is strictly included in the class Σ^1_1 of analytic sets, which are obtained by projection of Borel sets.
A set E ⊆ Σ^ω is in the class Σ^1_1 iff there exists F ⊆ (Σ × {0, 1})^ω such that F is Π^0_2 and E is the projection of F onto Σ^ω.
A set E ⊆ Σ^ω is in the class Π^1_1 iff Σ^ω − E is in Σ^1_1.
Suslin's Theorem states that: Borel sets = Δ^1_1 = Σ^1_1 ∩ Π^1_1.

  19. Deterministic ω-NNs
We consider RNNs with Boolean input and output cells, sigmoidal internal cells, working on infinite input streams.
[Figure: Boolean input cells → sigmoid internal cells → Boolean output cells; an infinite Boolean input stream is mapped to an infinite Boolean output stream whose dynamics eventually enter a (periodic) attractor.]
The attractors are assumed to be classified into two possible kinds: accepting or rejecting.
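For a Boolean network driven by a periodic input stream, the pair (state, position in the input period) ranges over a finite set, so the dynamics must eventually repeat; the states between two visits to the same pair form the periodic attractor. A sketch of this cycle detection (all names invented):

```python
# Sketch (all names invented): detect the periodic attractor that a Boolean
# network enters when driven by a periodic input stream.  The pair
# (state, position in the input period) lives in a finite set, so it must
# eventually repeat; the states between the two visits form the attractor.

def attractor(step, x0, period):
    seen, trace = {}, []
    x, t = tuple(x0), 0
    while (x, t % len(period)) not in seen:
        seen[(x, t % len(period))] = t
        trace.append(x)
        x = tuple(step(x, period[t % len(period)]))
        t += 1
    return trace[seen[(x, t % len(period))]:]

# Toy network: a single cell that copies the current input bit, so under the
# periodic input 1, 0, 1, 0, ... the attractor alternates between two states.
cyc = attractor(lambda x, u: [u], [0], [1, 0])
# An accepting/rejecting classification would then be a predicate on cyc.
```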
