  1. Local Distributed Decision. Pierre Fraigniaud, Amos Korman, David Peleg. CNRS and University Paris Diderot. Workshop on Sublinear Algorithms, Bertinoro, May 23-27, 2011.

  3. Outline: Decision problems · Does randomization help? · Nondeterminism · Power of oracles · Further works

  4. Decide coloring

  7. Computational model. LOCAL model: in each round during the execution of a distributed algorithm, every processor (1) sends messages to its neighbors, (2) receives messages from its neighbors, and (3) computes, i.e., performs individual computations. Input: an input configuration is a pair (G, x) where G is a connected graph and every node v ∈ V(G) is assigned, as its local input, a binary string x(v) ∈ {0,1}*. Output: out_A(G, x, Id, v) denotes the output of node v when Algorithm A runs in G with input x and identity assignment Id.
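To make the locality constraint concrete, here is a minimal Python sketch (not from the slides; names such as `ball` and `run_local` are invented for illustration) of the standard equivalence: a t-round LOCAL algorithm amounts to each node mapping its radius-t view of (G, x, Id) to an output.

```python
from collections import deque

def ball(adj, v, t):
    """Nodes within distance t of v: the radius-t view that a
    t-round LOCAL algorithm can gather at v."""
    dist = {v: 0}
    q = deque([v])
    while q:
        u = q.popleft()
        if dist[u] == t:
            continue
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return set(dist)

def run_local(adj, x, ids, t, local_rule):
    """Simulate a t-round LOCAL algorithm: each node applies local_rule
    to its radius-t view (identities, inputs, and edges inside the ball)."""
    out = {}
    for v in adj:
        B = ball(adj, v, t)
        view = {u: (ids[u], x[u], [w for w in adj[u] if w in B]) for u in B}
        out[v] = local_rule(v, view)
    return out
```

The point of the sketch is only that `local_rule` never sees anything outside the ball, which is exactly the information limit the LOCAL model imposes.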

  12. Languages. A distributed language is a decidable collection of configurations. ▸ Coloring = {(G, x) s.t. ∀v ∈ V(G), ∀w ∈ N(v), x(v) ≠ x(w)}. ▸ At-Most-One-Marked = {(G, x) s.t. ‖x‖₁ ≤ 1}. ▸ Consensus = {(G, (x1, x2)) s.t. ∃u ∈ V(G), ∀v ∈ V(G), x2(v) = x1(u)}. ▸ MIS = {(G, x) s.t. S = {v ∈ V(G) | x(v) = 1} is a MIS}.
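These definitions translate directly into centralized membership tests. The checkers below are illustrative sketches (function names are invented), assuming graphs are given as adjacency lists and inputs as dictionaries over the nodes:

```python
def in_coloring(adj, x):
    """(G, x) ∈ Coloring: no edge joins two nodes with the same input."""
    return all(x[v] != x[w] for v in adj for w in adj[v])

def in_at_most_one_marked(adj, x):
    """(G, x) ∈ At-Most-One-Marked: at most one node has input 1."""
    return sum(1 for v in adj if x[v] == 1) <= 1

def in_mis(adj, x):
    """(G, x) ∈ MIS: S = {v : x(v) = 1} is a maximal independent set."""
    S = {v for v in adj if x[v] == 1}
    independent = all(w not in S for v in S for w in adj[v])
    maximal = all(v in S or any(w in S for w in adj[v]) for v in adj)
    return independent and maximal
```

Decidability of the language only asks that such a test exists; the distributed question is how much locality suffices to run it.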

  13. Decision. Let L be a distributed language. Algorithm A decides L ⇔ for every configuration (G, x): ▸ if (G, x) ∈ L, then for every identity assignment Id, out_A(G, x, Id, v) = “yes” for every node v ∈ V(G); ▸ if (G, x) ∉ L, then for every identity assignment Id, out_A(G, x, Id, v) = “no” for at least one node v ∈ V(G).

  15. Local decision. Let t be a function of triplets (G, x, Id). Definition: LD(t) is the class of all distributed languages that can be decided by a distributed algorithm running in at most t communication rounds. ▸ Coloring ∈ LD(1) and MIS ∈ LD(1). ▸ AMOM, Consensus, and SpanningTree are not in LD(t) for any t = o(n).
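As a concrete instance of Coloring ∈ LD(1), here is a sketch (hypothetical helper name) of the one-round decider: each node exchanges its input with its neighbors and says “yes” iff its own color differs from all of theirs, and the configuration is accepted iff every node says “yes”.

```python
def coloring_decider(adj, x):
    """One-round local decider for Coloring: node v sees x(w) for all
    neighbors w and outputs "yes" iff x(v) differs from each of them.
    The global verdict follows the all-"yes" acceptance rule."""
    out = {v: "yes" if all(x[v] != x[w] for w in adj[v]) else "no"
           for v in adj}
    return out, all(o == "yes" for o in out.values())
```

Note that a monochromatic edge is always witnessed by both of its endpoints, which is why one round suffices for rejection.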

  16. Outline: Decision problems · Does randomization help? · Nondeterminism · Power of oracles · Further works

  17. Related work. What can be computed locally? Define LCL as LD(O(1)) restricted to ▸ graphs of constant maximum degree, and ▸ inputs taken from a set of constant size. Theorem (Naor and Stockmeyer [STOC ’93]): if there exists a randomized algorithm that constructs a solution for a problem in LCL in O(1) rounds, then there is also a deterministic algorithm constructing a solution for that problem in O(1) rounds. The proof uses Ramsey theory, and it is not clearly extendable to languages in LD(O(1)) \ LCL.

  18. (∆+1)-coloring. Arbitrary graphs: ▸ can be computed randomly in expected O(log n) rounds (Alon, Babai, Itai [J. Algorithms 1986]; Luby [SIAM J. Comput. 1986]); ▸ the best known deterministic algorithm runs in 2^{O(√log n)} rounds (Panconesi, Srinivasan [J. Algorithms 1996]). Bounded-degree graphs: ▸ randomization does not help for 3-coloring the ring (Naor [SIAM J. Discrete Math. 1991]); ▸ can be computed randomly in expected O(log ∆ + √log n) rounds (Schneider, Wattenhofer [PODC 2010]); ▸ the best known deterministic algorithm runs in O(∆ + log* n) rounds (Barenboim, Elkin [STOC 2009]; Kuhn [SPAA 2009]).

  20. 2-sided error Monte Carlo algorithms. Focus on distributed algorithms that use randomization but whose running times are deterministic. (p, q)-decider: ▸ if (G, x) ∈ L then, for every identity assignment Id, Pr[out_A(G, x, Id, v) = “yes” for every node v ∈ V(G)] ≥ p; ▸ if (G, x) ∉ L then, for every identity assignment Id, Pr[out_A(G, x, Id, v) = “no” for at least one node v ∈ V(G)] ≥ q.

  23. Example: AMOM. Randomized algorithm: ▸ every unmarked node says “yes” with probability 1; ▸ every marked node says “yes” with probability p. Remarks: ▸ runs in zero time; ▸ if the configuration has at most one marked node, the algorithm is correct with probability at least p; ▸ if there are k ≥ 2 marked nodes, it is correct with probability at least 1 − p^k ≥ 1 − p^2; ▸ thus there exists a (p, q)-decider for every q with p^2 + q ≤ 1.
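The acceptance probabilities claimed above are easy to check empirically. This is an illustrative simulation (function names invented) of the zero-round decider: with one marked node the deck predicts acceptance probability p, and with k ≥ 2 marked nodes acceptance probability p^k.

```python
import random

def amom_decider(marks, p):
    """Zero-round randomized decider for At-Most-One-Marked:
    unmarked nodes say "yes"; each marked node independently says
    "yes" with probability p.  Accept iff all nodes say "yes"."""
    return all(m == 0 or random.random() < p for m in marks)

def estimate_accept(marks, p, trials=100_000):
    """Monte Carlo estimate of the acceptance probability."""
    return sum(amom_decider(marks, p) for _ in range(trials)) / trials
```

For p = 0.6, the estimate is about 0.6 with one marked node and about 0.36 = p^2 with two, matching the 1 − p^2 rejection bound on “no” instances.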

  25. Bounded-probability error local decision. Definition: BPLD(t, p, q) is the class of all distributed languages that have a randomized distributed (p, q)-decider running in time at most t, i.e., that can be decided in time at most t by a randomized distributed algorithm with “yes” success probability p and “no” success probability q. Remark: for p and q such that p^2 + q ≤ 1, there exists a language L ∈ BPLD(0, p, q) such that L ∉ LD(t) for any t = o(n).

  27. A sharp threshold for hereditary languages. A prefix of a configuration (G, x) is a configuration (G[U], x[U]) where U ⊆ V(G). Hereditary languages: a language L is hereditary if every prefix of every configuration (G, x) ∈ L is also in L. ▸ Coloring and AMOM are hereditary languages. ▸ Every language {(G, ε) | G ∈ G} where G is a hereditary graph family is... hereditary. (Examples of hereditary graph families: planar graphs, interval graphs, forests, chordal graphs, cographs, perfect graphs, etc.) Theorem: let L be a hereditary language and let t be a function of triplets (G, x, Id). If L ∈ BPLD(t, p, q) for constants p, q ∈ (0, 1] such that p^2 + q > 1, then L ∈ LD(O(t)).

  28. One ingredient in the proof. Let 0 < δ < p^2 + q − 1, and define λ = 11 · ⌈log p / log(1 − δ)⌉. Separating partition: a separating partition of (G, x, Id) is a triplet (S, U1, U2) of pairwise disjoint subsets of nodes such that S ∪ U1 ∪ U2 = V and dist_G(U1, U2) ≥ λ · t. (Figure: S separates U1 from U2.)
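The distance condition in this definition can be checked with a multi-source BFS. The sketch below (invented helper names) assumes the node sets are given as Python sets over an adjacency-list graph:

```python
from collections import deque

def graph_dist(adj, A, B):
    """Shortest-path distance between node sets A and B,
    via multi-source BFS started from all of A."""
    dist = {v: 0 for v in A}
    q = deque(A)
    while q:
        u = q.popleft()
        if u in B:
            return dist[u]
        for w in adj[u]:
            if w not in dist:
                dist[w] = dist[u] + 1
                q.append(w)
    return float("inf")

def is_separating(adj, S, U1, U2, lam, t):
    """(S, U1, U2) is a separating partition of G iff the three sets
    are pairwise disjoint, cover V, and dist_G(U1, U2) >= lam * t."""
    V = set(adj)
    disjoint = not (S & U1 or S & U2 or U1 & U2)
    cover = (S | U1 | U2) == V
    return disjoint and cover and graph_dist(adj, U1, U2) >= lam * t
```

The λ·t gap is what lets the proof run the decider on G1 and G2 independently: a t-round execution inside U1 cannot be influenced by U2.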

  29. Glueing lemma. Given a separating partition (S, U1, U2) of (G, x, Id), let G_k = G[U_k ∪ S] and let x_k be the input x restricted to the nodes of G_k, for k = 1, 2. (Figure: G1 and G2 overlap on S.) Lemma (⋆): for every instance (G, x) with identity assignment Id and every separating partition (S, U1, U2) of (G, x, Id), we have: (G1, x1) ∈ L and (G2, x2) ∈ L ⇒ (G, x) ∈ L. Remark: Lemma (⋆) does not use the fact that L is hereditary, but it does use p^2 + q > 1.
