Synchronous Computations, Basic techniques (Secs. 6.1-6.2)


  1. Synchronous Computations, Basic techniques (Secs. 6.1-6.2)
     T-79.4001 Seminar on Theoretical Computer Science
     Aleksi Hänninen, 21.03.2007

  2. Outline
     Synchronous Systems
     ◮ Definitions
     ◮ New Restrictions
     Synchronous Algorithms
     ◮ Speed
     ◮ TwoBits
     ◮ The Cost of Synchronous Protocols
     Communicators, pipeline, and transformers
     ◮ Two-Party Communication Problem
     ◮ Pipeline
     ◮ Asynchronous-to-Synchronous Transformation

  3. Outline (current part: Synchronous Systems)

  4. Notation
     ◮ n is the number of nodes, m is the number of edges
     ◮ R = {Bidirectional Links, Connectivity, Total Reliability} is the standard set of restrictions
     ◮ M[P] is the number of messages used by protocol P
     ◮ T[P] is the time required by protocol P
     ◮ B[P] is the number of bits used by protocol P
     ◮ ⟨B[P], T[P]⟩ is the total cost of protocol P

  5. Restrictions for a fully synchronous system
     6.1.1 Synchronized Clocks
     ◮ Every entity has a local clock, and all clocks are incremented by one unit simultaneously at each tick δ; the clocks need not show the same value.
     ◮ Messages are transmitted to neighbours only at the strike of a clock tick.
     ◮ At each clock tick, an entity sends at most one message to the same neighbour.
     6.1.2 Bounded Communication Delays
     ◮ There exists a known upper bound Δ on the communication delays.
     A system satisfying these restrictions is a fully synchronous system. The messages are called packets and have a bounded size c.

  6. Synchronous restriction: Synch
     Every system satisfying 6.1.1 and 6.1.2 can be “normalized” so that a communication delay is at most one (new, coarser) clock tick δ, assuming the entities know when to begin ticking.
     6.1.3 Unitary Communication Delays
     ◮ In the absence of failures, a transmitted message arrives and is processed within at most one clock tick.
     6.1.1 + 6.1.3 = Synch
     ◮ A message sent at time t is received at time t + 1
     ◮ This simplifies the analysis
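
To make the Synch restriction concrete, here is a minimal round-based sketch in Python (my own illustration, not from the slides; names such as Network and tick are invented) in which every message sent at tick t is delivered at tick t + 1:

    # Minimal sketch of a fully synchronous (Synch) execution model:
    # messages sent at tick t are delivered exactly at tick t + 1.
    from collections import defaultdict

    class Network:
        def __init__(self, edges):
            self.neighbours = defaultdict(set)
            for u, v in edges:                        # Bidirectional Links
                self.neighbours[u].add(v)
                self.neighbours[v].add(u)
            self.in_transit = []                      # messages sent during this tick
            self.clock = 0

        def send(self, u, v, msg):
            assert v in self.neighbours[u]            # only neighbours can be reached
            self.in_transit.append((u, v, msg))

        def tick(self):
            # advance all local clocks by one unit and deliver last tick's messages
            inboxes = defaultdict(list)
            for u, v, msg in self.in_transit:
                inboxes[v].append((u, msg))           # received at time t + 1
            self.in_transit = []
            self.clock += 1
            return inboxes

A driver would alternate the two phases: let every entity send according to its current inbox, then call tick() to move to the next time unit.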

  7. Outline (current part: Synchronous Algorithms)

  8. Synchronous Minimum Finding: AsFar + delay in send = Speed
     ◮ Restrictions: IR ∪ Synch ∪ Ring ∪ InputSize(2^c)
     ◮ AsFar: optimal message complexity on average, but the worst case is O(n^2)
     ◮ Idea: delay the forwarding of messages carrying large ids, so that messages with smaller ids can catch up
     ◮ The delay f(i) is a monotonically increasing function of the id i
     ◮ Correctness: essentially because AsFar is correct (the message carrying the smallest id is never discarded)
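
As a rough illustration of the waiting idea (a simplified sketch of my own, not the author's code: the per-hop wait and the unit transmission time are folded together), the following simulates Speed on a unidirectional synchronous ring with f(i) = 2^i:

    # Hedged sketch of Speed on a unidirectional synchronous ring:
    # a message carrying id i waits f(i) = 2**i ticks per hop, so messages
    # with smaller ids overtake and eliminate those with larger ids.
    def speed_min(ids):
        n = len(ids)
        smallest_seen = list(ids)                     # smallest id each node has seen
        # every entity transmits its own id; a message is (arrival_tick, dest, id)
        in_transit = [(2 ** ids[u], (u + 1) % n, ids[u]) for u in range(n)]
        messages = n                                  # count transmissions
        while True:
            in_transit.sort()
            t, v, i = in_transit.pop(0)               # next message to arrive
            if i == ids[v]:                           # the minimum returned home
                return i, messages
            if i < smallest_seen[v]:                  # forward only new minima
                smallest_seen[v] = i
                in_transit.append((t + 2 ** i, (v + 1) % n, i))
                messages += 1
            # messages with an id no smaller than one already seen are discarded

    print(speed_min([5, 3, 8, 1, 6]))                 # minimum 1, message count below 2n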

  9. Complexity
     If f(i) := 2^i, then
        M[Speed] ≤ Σ_{j=1..n} n / 2^(j-1) < 2n
     By the time the smallest id has gone all the way around, the second smallest has travelled at most half the way, the third smallest at most n/4, and so on. Hence M[Speed] = O(n)!
     ◮ This message complexity is below the Ω(n log n) lower bound for asynchronous rings
     ◮ However, T[Speed] = O(n · 2^i)
     ◮ The time is exponential in the input value i, i.e., it depends on the ids and not only on the ring size n
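
The geometric sum can be sanity-checked numerically (illustrative only):

    # The j-th smallest id travels at most n / 2**(j-1) hops before the minimum
    # completes its tour, so the total message count is a geometric sum below 2n.
    def message_bound(n):
        return sum(n / 2 ** (j - 1) for j in range(1, n + 1))

    for n in (8, 64, 1024):
        print(n, message_bound(n), 2 * n)             # always strictly below 2n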

  10. TwoBits
     ◮ Situation: entity x wants to communicate a bit string α to its neighbour y using only two transmitted bits.
     ◮ Int(1α) is some known bijection between bit strings and integers.
     ◮ Silence conveys information too!

  11. Protocol TwoBits
     Sender x:
     1. Send “start counting”
     2. Wait Int(1α) ticks
     3. Send “stop counting”
     Receiver y:
     1. Upon receiving “start counting”, record the current time c1
     2. Upon receiving “stop counting”, take the current time c2; since c2 − c1 = Int(1α), α can be deduced
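
The protocol is easy to express in code. In this hedged sketch (my own; the slides do not fix the bijection) Int(1α) is realized by prepending a 1 to α and reading the result as a binary integer:

    # TwoBits sketch: only two signals are transmitted; the quantum of silence
    # between them carries the information.  Int(1α) = value of the string "1" + α.
    def int_1alpha(alpha: str) -> int:
        return int("1" + alpha, 2)                    # bijection bit strings -> integers

    def alpha_from_int(v: int) -> str:
        return bin(v)[3:]                             # drop "0b" and the leading 1

    def sender(alpha: str):
        t_start = 0                                   # send "start counting" at tick 0
        t_stop = t_start + int_1alpha(alpha)          # wait Int(1α) ticks, send "stop counting"
        return t_start, t_stop

    def receiver(c1: int, c2: int) -> str:
        return alpha_from_int(c2 - c1)                # c2 - c1 = Int(1α)

    alpha = "01101"
    assert receiver(*sender(alpha)) == alpha          # 2 bits sent, time about 2**(|α| + 1)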

  12. Total Cost
     ◮ The cost of a fully synchronous protocol comprises both time and transmitted bits.
     ◮ Time can be saved by using more bits, and bits can be saved by using more time.
     Cost[Protocol P] = ⟨B[P], T[P]⟩
     Cost[Speed(i)] = ⟨O(n log i), O(n · 2^i)⟩
     Cost[TwoBits(α)] = ⟨2, O(2^|α|)⟩

  13. Outline (current part: Communicators, pipeline, and transformers)

  14. Two-Party Communication Problem (TPC)
     ◮ In fully synchronous systems, the absence of a transmission can also be used to convey information between the nodes.
     ◮ There are many solutions to the Two-Party Communication problem, called communicators.
     ◮ The time and bit costs of sending information I between neighbours using communicator C are denoted Time(C, I) and Bit(C, I), respectively.

  15. TPC setting
     ◮ The sender sends k packets p_0, ..., p_{k-1}, which define k − 1 quanta of time q_1, ..., q_{k-1} (the silent intervals between consecutive packets).
     ◮ The ordered sequence ⟨p_0 : q_1 : ··· : q_{k-1} : p_{k-1}⟩ is called a communication sequence.
     ◮ A communicator C must specify an encoding function that maps the information to a communication sequence and a decoding function that recovers it.

  16. 2-bit Communicator: Protocol C_2
     ◮ encode(i) = ⟨b_0 : i : b_1⟩
     ◮ decode(⟨b_0 : q_1 : b_1⟩) = q_1
     ◮ Cost[C_2(i)] = ⟨2, i + 2⟩
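
A direct, illustrative rendering of C_2 in Python (names are mine): the packet values b_0 and b_1 carry no information, the quantum q_1 = i does all the work.

    # C2 sketch: encode(i) = <b0 : i : b1>, i.e. two one-bit packets separated
    # by a quantum of i silent ticks.  Cost[C2(i)] = <2 bits, i + 2 time units>.
    def c2_encode(i, b0=0, b1=0):
        return (b0, i, b1)                            # communication sequence

    def c2_decode(seq):
        b0, q1, b1 = seq
        return q1                                     # the packet values are ignored

    assert c2_decode(c2_encode(13)) == 13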

  17. Hacking
     ◮ The values of the bits b_0 and b_1 are never used by decode, so they may be changed in transit: the protocol is corruption tolerant.
     ◮ We can therefore also encode 2 bits of i into the values of b_0 and b_1. The resulting protocol, R_2(i), reduces the time to about 2 + i/4.
     ◮ From now on we restrict ourselves to corruption-tolerant TPC and write a communication sequence using only the quanta q_i, as a tuple ⟨q_1 : ··· : q_k⟩.
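
A hedged sketch of the trick (my own names): the two low-order bits of i are stored in the packet values b_0 and b_1, so only about a quarter of i has to be waited out, which is where the 2 + i/4 time comes from.

    # R2 sketch: reuse the packet values for the two low-order bits of i,
    # so the quantum of silence only needs to encode floor(i / 4).
    def r2_encode(i):
        b0 = i & 1                                    # lowest bit in the first packet
        b1 = (i >> 1) & 1                             # next bit in the second packet
        return (b0, i >> 2, b1)                       # quantum = floor(i / 4)

    def r2_decode(seq):
        b0, q, b1 = seq
        return (q << 2) | (b1 << 1) | b0

    assert all(r2_decode(r2_encode(i)) == i for i in range(64))   # 2 bits, time ≈ 2 + i/4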
