
Algorithms for Big Data (IX), Chihao Zhang, Shanghai Jiao Tong University - PowerPoint PPT Presentation



  1. Algorithms for Big Data (IX). Chihao Zhang, Shanghai Jiao Tong University. Nov. 15, 2019.


  2. Review. Last week, we saw the communication model. In this model, Alice and Bob collaborate to compute some function f(x, y): Alice has x and Bob has y, and the complexity is measured by the number of bits communicated between them. We focus on the special one-way communication model.
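  To make the one-way model concrete, the following is a minimal sketch (ours, not from the slides) of a deterministic one-way protocol as a pair of functions: Alice compresses her input x into a single message, and Bob must output f(x, y) from that message and his own input y alone. The names run_one_way, alice_message and bob_output are illustrative; the cost of the protocol is the length of the message (a binary string here, so characters = bits).

      from typing import Callable

      # A deterministic one-way protocol: Alice sends a single message to Bob,
      # and Bob must produce the answer from (message, y) alone.
      def run_one_way(alice_message: Callable[[str], str],
                      bob_output: Callable[[str, str], int],
                      x: str, y: str) -> tuple:
          msg = alice_message(x)           # the only communication
          answer = bob_output(msg, y)      # Bob never sees x directly
          return answer, len(msg)          # (Bob's output, cost = message length)

      # Trivial protocol: Alice sends all of x (cost n) and Bob evaluates f himself.
      if __name__ == "__main__":
          f = lambda u, v: int(u == v)                    # any f(x, y) works here
          ans, cost = run_one_way(lambda x: x,            # message = x itself
                                  lambda m, y: f(m, y),
                                  "10110", "10110")
          print(ans, cost)                                # prints: 1 5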


  3. Randomness. We allow randomness in our communication protocol. In this model, we assume there exists some random source in the environment that both Alice and Bob can see. In this course, we consider protocols using public coins. Our lower bound applies to protocols using public coins, and hence also applies to ones using private coins.
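  A public-coin protocol only changes the interface above by giving both parties read access to the same random string drawn from the shared source. Below is a minimal sketch under that assumption (names again illustrative, not from the slides); the example protocol sends the single parity bit ⟨x, r⟩ mod 2, which already distinguishes x from any y ≠ x with probability 1/2 over the shared r.

      import random
      from typing import Callable

      # A public-coin one-way protocol: both parties read the same random bits r.
      def run_public_coin(alice_message: Callable[[str, list], str],
                          bob_output: Callable[[str, str, list], int],
                          x: str, y: str, coin_bits: int) -> tuple:
          r = [random.randint(0, 1) for _ in range(coin_bits)]  # shared random source
          msg = alice_message(x, r)        # Alice sees x and r
          answer = bob_output(msg, y, r)   # Bob sees the message, y and r, never x
          return answer, len(msg)

      # Example: Alice sends the single bit <x, r> mod 2; Bob compares it with <y, r> mod 2.
      # If x != y, the two parities differ with probability exactly 1/2 over r.
      if __name__ == "__main__":
          parity = lambda z, r: sum(int(b) & c for b, c in zip(z, r)) % 2
          ans, cost = run_public_coin(lambda x, r: str(parity(x, r)),
                                      lambda m, y, r: int(int(m) == parity(y, r)),
                                      "10110100", "10110101", coin_bits=8)
          print(ans, cost)   # ans guesses whether x = y; cost is 1 bit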


  4. Problems. For x ∈ {0, 1}^n, we sometimes view it as the indicator vector of some subset of [n], namely S(x) = {i ∈ [n] : x_i = 1}.
     EQ(x, y) ≜ 1[x = y]
     ▶ We showed by a counting argument that any deterministic protocol to compute EQ needs n bits of one-way communication.
     ▶ There exists a randomized protocol using O(log n) bits of communication (a sketch of one such protocol follows this slide).
     DISJ(x, y) ≜ 1[S(x) ∩ S(y) = ∅]
     ▶ We showed that any randomized protocol requires Ω(log n) bits of communication.
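  The slides state the O(log n) upper bound for EQ without the protocol, so here is a sketch of one standard construction (arithmetic fingerprinting), which may differ from the protocol given in lecture: Alice views x as an integer, samples a prime p ≤ n², and sends the pair (p, x mod p), which is O(log n) bits; Bob accepts iff y mod p agrees. When x ≠ y, the difference |x − y| < 2^n has at most n prime factors, while there are roughly n²/(2 ln n) primes below n², so the error probability is O((log n)/n). All function names are ours.

      import random

      def is_prime(m: int) -> bool:
          # Trial division; fine for the small moduli (at most n^2) used here.
          if m < 2:
              return False
          d = 2
          while d * d <= m:
              if m % d == 0:
                  return False
              d += 1
          return True

      def random_prime(upper: int) -> int:
          # Rejection-sample a uniform prime in [2, upper].
          while True:
              p = random.randint(2, upper)
              if is_prime(p):
                  return p

      def eq_fingerprint(x: str, y: str) -> tuple:
          """One-way fingerprinting protocol for EQ on n-bit strings.
          Returns (Bob's answer, number of bits Alice sent)."""
          n = len(x)
          p = random_prime(max(n * n, 5))              # Alice's random prime, O(log n) bits
          fingerprint = int(x, 2) % p                  # Alice's message is the pair (p, x mod p)
          cost = p.bit_length() + fingerprint.bit_length()
          answer = int(int(y, 2) % p == fingerprint)   # Bob compares with y mod p
          return answer, cost

      if __name__ == "__main__":
          x = "1" * 64
          y = "1" * 63 + "0"
          # Always correct when x = y; wrong only if the random prime divides |x - y|.
          print(eq_fingerprint(x, x))   # (1, ~24 bits)
          print(eq_fingerprint(x, y))   # almost always (0, ~24 bits)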


  5. Yao’s Lemma. The main tool to prove lower bounds for randomized protocols is Yao’s lemma.
     Lemma. If there exists some distribution D over {0, 1}^a × {0, 1}^b such that any deterministic one-way communication protocol P with Pr_{(x,y)∼D}[P is wrong on (x, y)] ≤ ε costs at least k bits, then any randomized one-way protocol with error at most ε on any input also costs at least k bits of one-way communication.
     We remark that “costs at least k bits” for a randomized protocol applies to the worst input with the worst random bits.

  6. Proof of Yao’s Lemma. Let P be a randomized protocol with error at most ε on any input that costs less than k bits in the worst case. We can first toss all coins and then run the protocol based on the results. Therefore, we can view P as a distribution over deterministic protocols, each of which costs less than k bits. By the definition of the cost of a randomized protocol, Pr[P is wrong on (x, y)] ≤ ε for every input (x, y). Thus, by the assumption, there exists a distribution D such that any deterministic one-way protocol that is wrong with probability at most ε over (x, y) ∼ D costs at least k bits. This is sufficient to imply a contradiction: for some fixing r of the coins, the deterministic protocol P_r satisfies Pr_{(x,y)∼D}[P_r is wrong on (x, y)] ≤ ε, yet P_r costs less than k bits.
     We remark that the converse of Yao’s lemma is also correct.
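  For completeness, the averaging step hidden in “this is sufficient to imply” can be spelled out as follows (our notation: r denotes the public coins and P_r the deterministic protocol obtained by fixing them):

     E_r [ Pr_{(x,y)∼D} [P_r is wrong on (x, y)] ] = E_{(x,y)∼D} [ Pr_r [P_r is wrong on (x, y)] ] ≤ ε,

  since the coins are independent of (x, y) and the inner probability on the right-hand side is at most ε for every fixed input. An average over r that is at most ε must have at least one term at most ε, i.e. some fixing r* with Pr_{(x,y)∼D} [P_{r*} is wrong on (x, y)] ≤ ε.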
