Turing Networks: Complexity Theory for Message-Passing Parallelism

  1. Turing Networks: Complexity Theory for Message-Passing Parallelism
     Jack Romo
     University of York
     jr1161@york.ac.uk
     April 2019

  2. Overview
     1 Introduction
     2 So What’s a Turing Network?
     3 Lower Bounds
     4 Upper Bounds
     5 Simulation
     6 Conclusions

  3. Complexity Theory
     • The study of resource requirements to solve problems
       • eg. time, memory, ...
     • Alternatively, the study of complexity classes, ie. sets of problems with the same computational overhead
       • How are they related?
       • eg. P ⊆ NP
     • Usually studied in the context of Turing Machines
       • A large amount of theory has been developed here over the last century
       • Would be ideal to reuse it in other problems
       • Parallel complexity theory?

  4. Parallel Complexity Theory
     • The complexity of parallel computations is extremely nontrivial
     • Turing Machines to parallelism: Parallel Turing Machines
       • Same, but can create new identical read-write heads at any step

  5. Parallel Complexity Theory
     • The complexity of parallel computations is extremely nontrivial
     • Turing Machines to parallelism: Parallel Turing Machines
       • Same, but can create new identical read-write heads at any step
     • However, this models shared memory only, not message passing
     • Other models can emulate message-passing parallelism, but do not connect to classical complexity theory

  6. Parallel Complexity Theory
     • The complexity of parallel computations is extremely nontrivial
     • Turing Machines to parallelism: Parallel Turing Machines
       • Same, but can create new identical read-write heads at any step
     • However, this models shared memory only, not message passing
     • Other models can emulate message-passing parallelism, but do not connect to classical complexity theory
     • This is a problem!

  7. Modelling Message-Passing Parallelism
     We would like a model that...
     1 Intuitively emulates a network of communicating processors.
     2 Allows for substantial complexity analysis.
     3 Relates classical complexity classes to its own parallel ones.
     4 Can simulate other models of parallelism with good complexity.

  8. Network Topologies
     • Take a simple undirected graph G = ⟨V, E⟩ of constant degree, with 1 ∈ V ⊆ ℕ and E ⊆ V × V a symmetric relation
     • Choose a vertex and index its neighbors from 1 to n; do this for every vertex
     • Call this the graph’s orientation, φ : V × ℕ ⇀ V
     • Call the pair 𝒢 = ⟨G, φ⟩ a network topology

  9. Network Topologies
     • Take a simple undirected graph G = ⟨V, E⟩ of constant degree, with 1 ∈ V ⊆ ℕ and E ⊆ V × V a symmetric relation
     • Choose a vertex and index its neighbors from 1 to n; do this for every vertex
     • Call this the graph’s orientation, φ : V × ℕ ⇀ V
     • Call the pair 𝒢 = ⟨G, φ⟩ a network topology
     • eg. Linked List
       • V = ℕ
       • E = the symmetric closure of { (v, v + 1) | v ∈ V }
       • φ(1, 1) = 2
       • φ(n + 1, 1) = n, φ(n + 1, 2) = n + 2, for n ∈ ℕ
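
As an illustration only (none of this code is from the talk), here is a minimal Python sketch of a network topology: φ is stored, for each vertex, as an ordered list of its neighbours, and the linked-list example above is truncated to a finite path so it can be built explicitly. The names NetworkTopology, phi and linked_list_topology are invented for the sketch.

    # Minimal sketch of a network topology <G, phi>; vertices are positive integers.
    class NetworkTopology:
        def __init__(self, neighbours):
            # neighbours: dict mapping each vertex to an ordered list of its neighbours
            self.neighbours = neighbours

        def phi(self, v, i):
            """Orientation: the i-th neighbour of v (1-indexed), or None if undefined."""
            ns = self.neighbours.get(v, [])
            return ns[i - 1] if 1 <= i <= len(ns) else None

    def linked_list_topology(n):
        """Finite prefix of the linked-list example: vertices 1..n joined in a path."""
        neighbours = {}
        for v in range(1, n + 1):
            ns = []
            if v > 1:
                ns.append(v - 1)   # predecessor is the first indexed neighbour
            if v < n:
                ns.append(v + 1)   # successor is the next indexed neighbour
            neighbours[v] = ns
        return NetworkTopology(neighbours)

    topo = linked_list_topology(5)
    assert topo.phi(3, 1) == 2 and topo.phi(3, 2) == 4
    assert topo.phi(1, 1) == 2   # vertex 1 has a single neighbour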

  10. Communicative Turing Machines
     • Each vertex is a Turing Machine
     • However, it needs some capacity to communicate
     • Add ’special transitions’ to send/receive a character
     • Index the neighbors to communicate with by the orientation
     • Two start states: a ’master’ state for vertex 1 and a ’slave’ state for the rest
     • Slaves must start by waiting for a message

  11. Communicative Turing Machines
     Definition (Communicative Turing Machine)
     A Communicative Turing Machine, or CTM, is a 10-tuple
        T = ⟨Q, Σ, Γ, q_m, q_s, h_a, h_r, δ_t, δ_s, δ_r⟩
     where Q, Σ are nonempty and finite, Σ ∪ {Λ} ⊂ Γ, q_m, q_s, h_a, h_r ∈ Q, and
        δ_t : Q × Γ ⇀ Q × Γ × {L, R, S}
        δ_s : Q × Γ ⇀ ℕ × Γ × Q²
        δ_r : Q ⇀ ℕ × Q
     are partial functions, where δ_t(q_s, x) and δ_s(q_s, x) are undefined for all x ∈ Γ.
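
Purely as an illustration of the shape of this definition, the 10-tuple can be written as a small data structure, with the partial functions δ_t, δ_s, δ_r stored as dictionaries that are simply undefined (missing keys) outside their domains. All Python names here are invented for the sketch.

    # Sketch of a Communicative Turing Machine as a 10-tuple (illustrative only).
    from dataclasses import dataclass, field

    @dataclass
    class CTM:
        Q: set        # states
        Sigma: set    # input alphabet
        Gamma: set    # tape alphabet; Sigma plus the blank sits inside it
        q_m: str      # master start state (run at vertex 1)
        q_s: str      # slave start state (run everywhere else)
        h_a: str      # accepting halt state
        h_r: str      # rejecting halt state
        delta_t: dict = field(default_factory=dict)  # (q, a) -> (q', a', move in {L, R, S})
        delta_s: dict = field(default_factory=dict)  # (q, a) -> (i, b, (q1, q2)), mirroring N x Gamma x Q^2
        delta_r: dict = field(default_factory=dict)  # q -> (i, q'), mirroring N x Q

        def __post_init__(self):
            # delta_t and delta_s must be undefined on the slave start state,
            # so a slave can only begin by waiting to receive.
            assert all(q != self.q_s for (q, _) in self.delta_t)
            assert all(q != self.q_s for (q, _) in self.delta_s)

    # Example: a CTM whose master halts and accepts immediately.
    tiny = CTM(Q={"qm", "qs", "ha", "hr"}, Sigma={"0", "1"}, Gamma={"0", "1", "_", "#"},
               q_m="qm", q_s="qs", h_a="ha", h_r="hr",
               delta_t={("qm", "_"): ("ha", "_", "S")})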

  12. Turing Networks
     Definition (Turing Network)
     A Turing Network is a pair 𝒯 = ⟨𝒢, T⟩, such that 𝒢 is a network topology and T is a Communicative Turing Machine.
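
Continuing the same illustrative sketch (and reusing the NetworkTopology and CTM classes from the sketches above), the pair itself is trivial to represent:

    # A Turing Network is just the pair <topology, CTM>; every vertex runs a copy of the CTM.
    from dataclasses import dataclass

    @dataclass
    class TuringNetwork:
        topology: "NetworkTopology"
        machine: "CTM"

    # e.g. TuringNetwork(topology=linked_list_topology(4), machine=tiny), reusing the earlier sketches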

  13. Defining Computations
     • Define a configuration of one CTM and a transition
     • Extend to a configuration/transition of a TN
     • Derivation sequences
     • Computations as terminating derivation sequences

  14. Configurations
     Definition (CTM Configuration)
     A CTM configuration of a CTM T is a 4-tuple of the form C ∈ Γ* × Γ × Q × Γ*. We name the set of all CTM configurations for the CTM T as C(T).
     We say, for CTM configurations C_n = ⟨r_n, s_n, q_n, t_n⟩, n ∈ ℕ, a network topology G = ⟨V, E⟩ and v_1, v_2 ∈ V:
        C_1 ⊢ C_2  ⇔  C_1 transitions to C_2 as ordinary TM configurations
        ⟨C_1, C_2⟩ ⊢_{v_1}^{v_2} ⟨C_3, C_4⟩  ⇔  v_1 in configuration C_1 sends a character to v_2 in configuration C_2, transitioning to C_3 and C_4
        C_1 ↝_{v_1} C_2  ⇔  v_1 sends to a nonexistent neighbor
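
For illustration, reading the 4-tuple as (tape to the left of the head, scanned symbol, current state, tape to the right), a CTM configuration and one ordinary δ_t step C_1 ⊢ C_2 can be sketched as follows; BLANK, Config and step_t are names invented here.

    # Sketch: a CTM configuration and one delta_t step (illustrative only).
    from typing import NamedTuple

    BLANK = "_"   # stands in for the blank symbol Lambda

    class Config(NamedTuple):
        left: str    # tape contents to the left of the head
        scan: str    # symbol under the head
        state: str   # current state
        right: str   # tape contents to the right of the head

    def step_t(cfg, delta_t):
        """Apply delta_t once (C1 |- C2 as ordinary TM configurations);
        returns the new configuration, or None if delta_t is undefined here."""
        key = (cfg.state, cfg.scan)
        if key not in delta_t:
            return None
        q2, write, move = delta_t[key]
        if move == "S":   # stay put
            return Config(cfg.left, write, q2, cfg.right)
        if move == "R":   # write, then shift the head right
            new_scan = cfg.right[0] if cfg.right else BLANK
            return Config(cfg.left + write, new_scan, q2, cfg.right[1:])
        if move == "L":   # write, then shift the head left
            new_scan = cfg.left[-1] if cfg.left else BLANK
            return Config(cfg.left[:-1], new_scan, q2, write + cfg.right)
        raise ValueError("move must be L, R or S")

    # In state q0 reading 'a': write 'b', move right, enter q1.
    d_t = {("q0", "a"): ("q1", "b", "R")}
    assert step_t(Config("", "a", "q0", "cd"), d_t) == Config("b", "c", "q1", "d")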

  15. Configurations
     Definition (TN Configuration)
     A TN configuration of a TN 𝒯 is a function of the form Ω : V → C(T).
     We say, for TN configurations Ω_n, n ∈ ℕ, of a Turing Network 𝒯 and v_1, v_2 ∈ V:
        Ω_1 ⊢_{v_1} Ω_2  ⇔  Ω_1|_{V \ {v_1}} = Ω_2|_{V \ {v_1}} ∧ (Ω_1(v_1) ⊢ Ω_2(v_1) ∨ Ω_1(v_1) ↝_{v_1} Ω_2(v_1))
        Ω_1 ⊢_{v_1}^{v_2} Ω_2  ⇔  Ω_1|_{V \ {v_1, v_2}} = Ω_2|_{V \ {v_1, v_2}} ∧ ⟨Ω_1(v_1), Ω_1(v_2)⟩ ⊢_{v_1}^{v_2} ⟨Ω_2(v_1), Ω_2(v_2)⟩
        Ω_1 ⊢ Ω_2  ⇔  (∃ v ∈ V • Ω_1 ⊢_v Ω_2) ∨ (∃ v_1, v_2 ∈ V • Ω_1 ⊢_{v_1}^{v_2} Ω_2)
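
A sketch of one global step, reusing Config/step_t from the previous sketch and a NetworkTopology-style topo object: either a single vertex takes an ordinary tape step (Ω_1 ⊢_{v} Ω_2), or a pair of neighbours performs one send/receive (Ω_1 ⊢_{v_1}^{v_2} Ω_2). The concrete send/receive semantics below (sender consults δ_s, receiver must be in a state where δ_r is defined, receives the character under its head, and the sender takes the first state of its Q² pair) are an assumed reading, not necessarily the talk's exact rules.

    # Sketch of global TN steps; a TN configuration is a dict vertex -> Config.
    def local_step(omega, v, delta_t):
        """Omega_1 |-_v Omega_2: vertex v takes one ordinary tape step, all others unchanged."""
        nxt = step_t(omega[v], delta_t)
        if nxt is None:
            return None
        out = dict(omega)
        out[v] = nxt
        return out

    def send_step(omega, v1, v2, topo, delta_s, delta_r):
        """Omega_1 |-_{v1}^{v2} Omega_2: v1 sends one character to its neighbour v2
        (assumed semantics; see the note above)."""
        c1, c2 = omega[v1], omega[v2]
        key = (c1.state, c1.scan)
        if key not in delta_s or c2.state not in delta_r:
            return None
        i, char, (q_sent, _q_alt) = delta_s[key]
        j, q_recv = delta_r[c2.state]
        # The orientation must make v1 and v2 each other's chosen neighbours.
        if topo.phi(v1, i) != v2 or topo.phi(v2, j) != v1:
            return None
        out = dict(omega)
        out[v1] = c1._replace(state=q_sent)
        out[v2] = c2._replace(scan=char, state=q_recv)
        return out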

  16. Initial and Final States
     Definition (Initial State)
     An initial state of a TN 𝒯 is a configuration Ω_S for some S ∈ Σ* where
        Ω_S(1) = ⟨λ, Λ, q_m, S⟩
        Ω_S(n + 1) = ⟨λ, Λ, q_s, λ⟩  ∀ n ∈ ℕ
     Definition (Final State)
     A final state of 𝒯 is a configuration Ω_h where Ω_h(1) = ⟨A, b, q, C⟩ with q ∈ {h_a, h_r}. The output string is AbC with all characters not in Σ deleted. We say Ω_h is accepting if q = h_a and rejecting otherwise.
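
Reusing Config and BLANK from the earlier sketch, the initial and final states translate almost directly; the explicit finite vertex list is an assumption of the sketch (the definition allows V = ℕ).

    # Sketch: building Omega_S and reading the output off a final state.
    def initial_configuration(S, vertices, q_m, q_s):
        """Omega_S: vertex 1 gets <lambda, blank, q_m, S>; every other vertex <lambda, blank, q_s, lambda>."""
        return {v: Config("", BLANK, q_m, S) if v == 1 else Config("", BLANK, q_s, "")
                for v in vertices}

    def is_final(omega, h_a, h_r):
        """A configuration is final once vertex 1 has reached a halt state."""
        return omega[1].state in (h_a, h_r)

    def is_accepting(omega, h_a):
        return omega[1].state == h_a

    def output_string(omega, Sigma):
        """Output: vertex 1's tape AbC with every character not in Sigma deleted."""
        c = omega[1]
        return "".join(ch for ch in c.left + c.scan + c.right if ch in Sigma)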

  17. Computations
     Definition (Derivation Sequence)
     A derivation sequence Ψ = {Ω_n}_{n ∈ X} is a sequence of indexed configurations of 𝒯, where X ⊆ ℕ and, for any n, m ∈ X, Ω_n ⊢ Ω_m if m is the least element of X greater than n.
     Say that, for two derivation sequences Ψ_1, Ψ_2 of 𝒯, Ψ_1 < Ψ_2 if the former is a prefix of the latter as a sequence.
     Definition (Computation)
     We say a derivation sequence Ψ is a computation if it starts with an initial state and ends with a final state.
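
The prefix order Ψ_1 < Ψ_2 is easy to state concretely. In this minimal sketch a derivation sequence is a Python list of TN configurations and the one-step relation ⊢ is supplied as a predicate; both names are assumptions of the sketch.

    # Sketch: derivation sequences as lists, with the prefix order between them.
    def is_derivation_sequence(psi, steps_to):
        """Each configuration must step to the next one; steps_to(a, b) is assumed to decide |-."""
        return all(steps_to(a, b) for a, b in zip(psi, psi[1:]))

    def is_prefix(psi1, psi2):
        """Psi_1 < Psi_2: the former is an initial segment of the latter."""
        return len(psi1) <= len(psi2) and psi2[:len(psi1)] == psi1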

  18. Acceptance, Rejection and Computing Functions
     Definition (Acceptance and Rejection)
     We say 𝒯 accepts a string S ∈ Σ* if every derivation sequence starting with Ω_S is less than an accepting computation and all rejecting computations are greater than an accepting computation. We say it rejects if there exists a rejecting computation not greater than some accepting computation.
     Definition (Computing Functions)
     Say 𝒯 computes a function f : Σ* ⇀ Σ* if, for every input string s ∈ dom(f), every computation of 𝒯 with input string s has a final state with output string f(s).

  19. Our Time Function
     • Insufficient to analyze the length of computations; things happen in parallel!
     • Define parallel sequences
     • Analyze the number of parallel sequences a computation is itself a sequence of
     • τ : Σ* → ℕ gives the longest parallel time of any computation starting with a given input string
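
The talk defines τ via parallel sequences, which are not spelled out on this slide. Purely as a rough illustration of the idea that parallel time counts rounds rather than individual steps, the sketch below greedily packs the steps of a sequential computation into rounds in which no vertex acts twice; this packing is an assumed stand-in for the talk's actual definition.

    # Rough sketch of a parallel-time count (assumed reading, not the talk's definition).
    # `trace` lists, for each sequential step of a computation, the set of vertices that act in it.
    def parallel_rounds(trace):
        """Greedily pack consecutive steps into rounds where every vertex acts at most once."""
        rounds, busy = 0, set()
        for acting in trace:
            if busy & acting:      # a vertex would act twice: close the round
                rounds += 1
                busy = set()
            busy |= acting
        return rounds + (1 if busy else 0)

    # Steps at vertices {1}, {2}, {1,2}, {3} pack into 2 parallel rounds.
    assert parallel_rounds([{1}, {2}, {1, 2}, {3}]) == 2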
