
Communication Complexity BASICS, Summer School 2015

Topics: communication complexity of relations; direct sum; lower bounds for Disjointness; asymmetric communication complexity and data structures.


  1. Theorem: for disjoint $X, Y \subseteq \{0,1\}^n$, consider the relation $R = \{(x,y,i) \mid x \in X,\ y \in Y,\ x_i \neq y_i\}$ and the boundary set $C = \{(x,y) \mid x \in X,\ y \in Y,\ d_H(x,y) = 1\}$, where $d_H$ denotes Hamming distance. Then the partition number of $R$ is at least $|C|^2/(|X||Y|)$: $R$ cannot be partitioned into fewer monochromatic rectangles. Consequently $D(R) \geq \Omega(2\log|C| - \log|X| - \log|Y|)$. Example $R_\oplus$: let $X$ be all $x \in \{0,1\}^n$ with parity 1 and $Y$ all $y$ with parity 0, so $|X| = |Y| = 2^{n-1}$ and $|C| = n\,2^{n-1}$ (each $x \in X$ has $n$ neighbors at Hamming distance 1, all of parity 0, hence all in $Y$). This gives $D(R_\oplus) = \Omega(\log n)$.

  2. Proof. Let $R_1, R_2, \ldots, R_t$ be an optimal partition of $R$ into monochromatic rectangles, and let $m_i = |R_i \cap C|$. Then $\sum_{i=1}^t |R_i| = |X||Y|$ and $\sum_{i=1}^t m_i = |C|$. In a rectangle monochromatic with answer $j$, distinct pairs of $C$ cannot share a row or a column: if $(x,y)$ and $(x,y')$ both lie in the rectangle, then $y$ and $y'$ are each at Hamming distance 1 from $x$ and each differ from $x$ at coordinate $j$, forcing $y = y'$. So the elements of $C$ inside $R_i$ occupy $m_i$ distinct rows and $m_i$ distinct columns, hence $|R_i| \geq m_i^2$.

  3. Combining, by Cauchy-Schwarz: $|C|^2 = \left(\sum_{i=1}^t m_i\right)^2 \leq t \sum_{i=1}^t m_i^2 \leq t \sum_{i=1}^t |R_i| = t\,|X||Y|$, hence $t \geq |C|^2/(|X||Y|)$.
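
The $R_\oplus$ counts are easy to verify by brute force for small $n$. A toy check I added (not part of the original slides), in Python:

```python
from itertools import product

# Brute-force check of |X|, |Y|, |C| and the partition bound for R_plus.
n = 4
X = [x for x in product((0, 1), repeat=n) if sum(x) % 2 == 1]  # parity 1
Y = [y for y in product((0, 1), repeat=n) if sum(y) % 2 == 0]  # parity 0
C = [(x, y) for x in X for y in Y
     if sum(a != b for a, b in zip(x, y)) == 1]                # Hamming distance 1

assert len(X) == len(Y) == 2 ** (n - 1)
assert len(C) == n * 2 ** (n - 1)
print(len(C) ** 2 // (len(X) * len(Y)))  # = n^2 = 16, so D(R_plus) = Omega(log n)
```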

  4. Theorem (Newman): $R_{\epsilon+\delta}(R) \leq R^{\mathrm{Pub}}_{\epsilon}(R) + O(\log n + \log \delta^{-1})$. We transform any public-coin protocol $P$ into a protocol $P'$ that uses only $O(\log n + \log(1/\delta))$ public random bits. Inputs: $x, y \in \{0,1\}^n$; public random bits $r \sim \Sigma$ (of any length). Define $Z(x,y,r) = 1$ if $P$ is wrong on inputs $x, y$ with random bits $r$, and $0$ otherwise; so for all legal $x, y$, $\mathbf{E}_{r\sim\Sigma}[Z(x,y,r)] \leq \epsilon$. Goal: find $r_1, r_2, \ldots, r_t$ such that for uniform $i \in [t]$ and all legal $x, y$, $\mathbf{E}_i[Z(x,y,r_i)] \leq \epsilon + \delta$. The index $i$ is the new public randomness, and $\{r_1, \ldots, r_t\}$ is hard-wired into $P'$.

  5. Sample $r_1, \ldots, r_t$ i.i.d. according to $\Sigma$. For any particular legal $x, y$, $\mathbf{E}_i[Z(x,y,r_i)] = \frac{1}{t}\sum_{i=1}^t Z(x,y,r_i)$, so by the Chernoff bound $\Pr_{r_1,\ldots,r_t}\!\left[\mathbf{E}_i[Z(x,y,r_i)] > \epsilon+\delta\right] = \Pr_{r_1,\ldots,r_t}\!\left[\sum_{i=1}^t Z(x,y,r_i) > (\epsilon+\delta)t\right] \leq e^{-2\delta^2 t}$. Choosing $t = O(n/\delta^2)$ makes this smaller than $2^{-2n}$; a union bound over the at most $2^{2n}$ input pairs gives $\Pr_{r_1,\ldots,r_t}[\exists x,y:\ \mathbf{E}_i[Z(x,y,r_i)] > \epsilon+\delta] < 1$, hence $\Pr_{r_1,\ldots,r_t}[\forall x,y:\ \mathbf{E}_i[Z(x,y,r_i)] \leq \epsilon+\delta] > 0$.
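
To get a feel for the counting, here is a toy simulation I added. It models $Z(x,y,r_j)$ as an independent Bernoulli($\epsilon$) draw per sampled string, a simplification (in a real protocol the $Z$ values are determined by $P$), but the per-pair Chernoff tail is the same, and the union bound needs no independence across pairs:

```python
import random

# Toy check: with t = O(n / delta^2) sampled strings, the empirical error on
# every one of the 2^(2n) input pairs stays below eps + delta w.h.p.
n, eps, delta = 6, 0.1, 0.1
t = int(n / delta ** 2)                   # t = 600 here
bad = 0
for _ in range(2 ** (2 * n)):             # one trial per legal input pair
    errs = sum(random.random() < eps for _ in range(t))
    if errs / t > eps + delta:
        bad += 1
print(bad)  # Chernoff + union bound: 0 on almost every run
```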

  6. Summary: such $r_1, \ldots, r_t$ with $t = O(n/\delta^2)$ exist, and for all legal inputs $x, y$, $\Pr_i[P \text{ is wrong on } x, y \text{ with random bits } r_i] \leq \epsilon + \delta$. Alice and Bob know $\{r_1, \ldots, r_t\}$ without communication, so $P'$ simply runs $P(x, y, r_i)$, where the uniform index $i \in [t]$ (only $\log t = O(\log n + \log \delta^{-1})$ bits) is the new public randomness.

  7. The FORK relation: $\mathrm{FORK} \subset \Sigma^\ell \times \Sigma^\ell \times \{1, \ldots, \ell-1\}$ over alphabet $\Sigma = \{1, 2, \ldots, w\}$. On inputs $x, y \in \Sigma^\ell$, output an index $i$ such that $x_i = y_i$ and $x_{i+1} \neq y_{i+1}$.

  8. Padded version: $\mathrm{FORK} \subset \Sigma^\ell \times \Sigma^\ell \times \{0, 1, \ldots, \ell\}$. Append sentinels $x_0 = y_0$ and $x_{\ell+1} \neq y_{\ell+1}$ to $x_1\cdots x_\ell,\ y_1\cdots y_\ell \in \Sigma^\ell$, so that a valid answer always exists. Output an index $i$ with $x_i = y_i$ and $x_{i+1} \neq y_{i+1}$; in particular $i = \ell$ is a valid answer when $x = y$, and $i = 0$ when $x$ and $y$ differ entry-wise. (The slide pictures an example with $w = 3$, $\ell = 6$ whose correct answers are $i = 0, 4, 6$.)

  9. Upper bound: $D(\mathrm{FORK}) = O(\log\ell\,\log w)$. How? Binary search: maintain a pair $(i, j)$ with $i < j$, $x_i = y_i$ and $x_j \neq y_j$, starting from the sentinels $i = 0$, $j = \ell+1$. In each round the players exchange one character of $\Sigma$ ($O(\log w)$ bits) to test the midpoint and halve the interval; after $O(\log\ell)$ rounds $j = i+1$, so $i$ is a valid answer.
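
A sketch of the binary-search protocol (my Python rendering; communication is implicit, with each loop iteration corresponding to one exchanged character of $\Sigma$, i.e. $O(\log w)$ bits):

```python
def fork_binary_search(x, y):
    """x, y are padded strings with x[0] == y[0] and x[-1] != y[-1].
    Returns i with x[i] == y[i] and x[i+1] != y[i+1]."""
    i, j = 0, len(x) - 1          # invariant: x[i] == y[i], x[j] != y[j]
    while j - i > 1:
        m = (i + j) // 2
        # Alice sends x[m] (log w bits); Bob compares it with y[m].
        if x[m] == y[m]:
            i = m                 # fork lies in the right half
        else:
            j = m                 # fork lies in the left half
    return i                      # j == i + 1, so i is a valid answer

# Hypothetical example: sentinel 0 in front, distinct last symbols appended.
x = [0] + [1, 2, 3, 1, 2, 1] + [1]
y = [0] + [2, 1, 3, 2, 1, 2] + [2]
print(fork_binary_search(x, y))   # prints 3: x[3] == y[3], x[4] != y[4]
```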

  10. FORK: $|\Sigma| = w$; for $x, y \in \Sigma^\ell$, find $i$ with $x_i = y_i$ and $x_{i+1} \neq y_{i+1}$. Definition: an $(\alpha, \ell)$-protocol successfully solves $\mathrm{FORK}$ for all $x, y \in S$, for some $S \subseteq \Sigma^\ell$ of size $|S| \geq \alpha w^\ell$; a protocol for $\mathrm{FORK}$ is a $(1, \ell)$-protocol. Lemma: if there is a $c$-bit $(\alpha, \ell)$-protocol for $\mathrm{FORK}$, then there is a $(c-1)$-bit $(\alpha/2, \ell)$-protocol. Proof: let $P$ solve $\mathrm{FORK}$ on $S$ with $|S| \geq \alpha w^\ell$; WLOG Alice sends the first bit $a \in \{0,1\}$. Let $S_a = \{x \in S \mid \text{Alice sends } a\}$, choose the larger of $S_0, S_1$, and run $P$ without Alice sending the first bit; this is correct for all $x, y \in S_a$ (under the assumption that Alice sent $a$).

  11. Iterating the lemma already gives $D(\mathrm{FORK}) = \Omega(\log w)$. Why not bigger? The subproblem must stay nontrivial: once $\alpha < 1/w$ the surviving set $S$ may be too sparse and the problem may trivialize, so the halving can be applied only about $\log w$ times. To do better we must also shrink $\ell$.

  12. Amplification Lemma: for $\mathrm{FORK}$ with $\alpha > 100/w$, if there is a $c$-bit $(\alpha, \ell)$-protocol then there is a $c$-bit $(\sqrt{\alpha}/2,\ \ell/2)$-protocol. Alternating this with the halving lemma yields $D(\mathrm{FORK}) = \Omega(\log\ell\,\log w)$.

  13. The lower bound: a protocol for $\mathrm{FORK}$ is a $(1, \ell)$-protocol, hence in particular a $(1/w^{1/3}, \ell)$-protocol. Applying the halving lemma $\Omega(\log w)$ times turns a $c$-bit $(1/w^{1/3}, \ell)$-protocol into a $(c - \Omega(\log w))$-bit $(4/w^{2/3},\ \ell)$-protocol; the Amplification Lemma then gives a $(c - \Omega(\log w))$-bit $\left(\sqrt{4/w^{2/3}}/2,\ \ell/2\right) = (1/w^{1/3},\ \ell/2)$-protocol. Repeating this round $O(\log\ell)$ times ends with a $(c - \Omega(\log\ell\,\log w))$-bit $(1/w^{1/3}, 2)$-protocol, which is still a legitimate protocol; hence $c \geq \Omega(\log\ell\,\log w)$.

  14. Summary [Grigni, Sipser '91]: $D(\mathrm{FORK}) = \Omega(\log\ell\,\log w)$, via the halving lemma and the Amplification Lemma (for $\alpha > 100/w$: a $c$-bit $(\alpha, \ell)$-protocol yields a $c$-bit $(\sqrt{\alpha}/2,\ \ell/2)$-protocol).

  15. Proving the Amplification Lemma: from $P$, which solves inputs from $S \subseteq \Sigma^\ell$, build $P'$, which uses $P$ to solve inputs from a denser $S' \subseteq \Sigma^{\ell/2}$. On $x, y \in S' \subseteq \Sigma^{\ell/2}$, Alice and Bob locally extend their inputs to $f(x), g(y) \in S \subseteq \Sigma^\ell$ and run $P$; the answer to $\mathrm{FORK}(f(x), g(y))$, an $i$ with $f(x)_i = g(y)_i$ and $f(x)_{i+1} \neq g(y)_{i+1}$, must tell them a $j$ with $x_j = y_j$ and $x_{j+1} \neq y_{j+1}$.

  16. First kind of extension: a common prefix. Suppose some $u \in \Sigma^{\ell/2}$ is such that many elements $z \in S$ have the form $z = (u, x)$. Take $f(x) = (u, x)$ and $g(y) = (u, y)$: the extensions agree on the entire first half, so any fork of $(f(x), g(y))$ lies in the second half (or at the boundary) and translates directly into a fork of $(x, y)$.

  17. Second kind of extension: entry-wise different suffixes. Suppose some large $S' \subseteq \Sigma^{\ell/2}$ is such that any $x, y \in S'$ can be extended to $(x, F(x)), (y, G(y)) \in S$ with $F(x)$ and $G(y)$ entry-wise different. Then the suffixes contain no fork, so any fork of $((x, F(x)), (y, G(y)))$ lies in the first half (or at the boundary) and yields a fork of $(x, y)$.

  18. Dichotomy: for any $S \subseteq \Sigma^\ell$ with $|S| \geq \alpha w^\ell$, either (i) there is a $u \in \Sigma^{\ell/2}$ such that many elements $z \in S$ have the form $z = (u, x)$, or (ii) there is a large $S' \subseteq \Sigma^{\ell/2}$ such that any $x, y \in S'$ can be extended to $(x, F(x)), (y, G(y)) \in S$ with $F(x), G(y)$ entry-wise different. Here both "many" and "large" mean $\frac{\sqrt{\alpha}}{2}\, w^{\ell/2}$.

  19. View $S \subseteq \Sigma^\ell$ with $|S| \geq \alpha w^\ell$ as a Boolean matrix indexed by $\Sigma^{\ell/2} \times \Sigma^{\ell/2}$: for $u, v \in \Sigma^{\ell/2}$, $S(u, v) = 1$ if $(u, v) \in S$ and $0$ otherwise; so $S$ is $\alpha$-dense in 1-entries. Claim: either some row is $\frac{\sqrt{\alpha}}{2}$-dense, or at least a $\frac{\sqrt{\alpha}}{2}$-fraction of the rows are $\frac{\alpha}{2}$-dense. ("Either one row is very dense, or there are many rows that are pretty dense.") By contradiction: if every row were less than $\frac{\sqrt{\alpha}}{2}$-dense and fewer than a $\frac{\sqrt{\alpha}}{2}$-fraction of the rows were $\frac{\alpha}{2}$-dense, then the density of $S$ would be less than $\frac{\sqrt{\alpha}}{2}\cdot\frac{\sqrt{\alpha}}{2} + \frac{\alpha}{2} = \frac{3\alpha}{4} < \alpha$, a contradiction.

  20. Restated over $\Sigma^{\ell/2}$, what we still need is: either there is a $u \in \Sigma^{\ell/2}$ with $|\{x : (u, x) \in S\}| \geq \frac{\sqrt{\alpha}}{2} w^{\ell/2}$ (a very dense row, which is case (i)), or there are at least $\frac{\sqrt{\alpha}}{2} w^{\ell/2}$ many $x \in \Sigma^{\ell/2}$ with $|\{u : (x, u) \in S\}| \geq \frac{\alpha}{2} w^{\ell/2}$ (many pretty dense rows); case (ii) must be derived from the latter.

  21. So assume $S \subseteq \Sigma^\ell$ has at least $\frac{\sqrt{\alpha}}{2} w^{\ell/2}$ many $x \in \Sigma^{\ell/2}$ with $|\{u : (x, u) \in S\}| \geq \frac{\alpha}{2} w^{\ell/2}$; we want an $S' \subseteq \Sigma^{\ell/2}$ of size $|S'| \geq \frac{\sqrt{\alpha}}{2} w^{\ell/2}$ such that any $x, y \in S'$ extend to $(x, F(x)), (y, G(y)) \in S$ with $F(x), G(y)$ entry-wise different. Goal: find nonempty subsets $F_1, F_2, \ldots, F_{\ell/2} \subset \Sigma$ and their complements $\bar F_1, \bar F_2, \ldots, \bar F_{\ell/2} \subset \Sigma$ such that for at least $\frac{\sqrt{\alpha}}{2} w^{\ell/2}$ many $x \in \Sigma^{\ell/2}$ there exist $u \in F_1 \times \cdots \times F_{\ell/2}$ with $(x, u) \in S$ (set $F(x) = u$) and $v \in \bar F_1 \times \cdots \times \bar F_{\ell/2}$ with $(x, v) \in S$ (set $G(x) = v$). Then any $u \in F_1 \times \cdots \times F_{\ell/2}$ and $v \in \bar F_1 \times \cdots \times \bar F_{\ell/2}$ are automatically entry-wise different: $u_i \neq v_i$ for all $1 \leq i \leq \ell/2$.

  22. Probabilistic construction: sample each $F_i \in \binom{\Sigma}{w/2}$ uniformly and independently at random (so each $\bar F_i$ is also a uniform $(w/2)$-subset). Call $x$ "good" if $|\{u : (x, u) \in S\}| \geq \frac{\alpha}{2} w^{\ell/2}$. For any good $x$, what is the probability that there exist $u \in F_1 \times \cdots \times F_{\ell/2}$ with $(x, u) \in S$ and $v \in \bar F_1 \times \cdots \times \bar F_{\ell/2}$ with $(x, v) \in S$?

  23. Call such an $x$ "really good": $\exists u \in F_1 \times \cdots \times F_{\ell/2}$ with $(x, u) \in S$ and $\exists v \in \bar F_1 \times \cdots \times \bar F_{\ell/2}$ with $(x, v) \in S$. We lower-bound $\Pr[x \text{ is really good}]$ for each good $x$.

  24. For any good $x$ (i.e. $|\{u : (x, u) \in S\}| \geq \frac{\alpha}{2} w^{\ell/2}$), $\Pr[\forall u \in F_1 \times \cdots \times F_{\ell/2},\ (x, u) \notin S] + \Pr[\forall v \in \bar F_1 \times \cdots \times \bar F_{\ell/2},\ (x, v) \notin S] < 2\left(1 - \frac{\alpha}{2}\right)^{w/2} < 2e^{-\alpha w/4}$ (why?), so $x$ fails to be really good with probability less than $2e^{-\alpha w/4}$.

  25. Hence $\Pr[x \text{ is really good}] > 1 - 2e^{-\alpha w/4}$ for each of the at least $\frac{\sqrt{\alpha}}{2} w^{\ell/2}$ good $x$, so $\mathbf{E}[\#\text{ of really good } x] \geq (1 - 2e^{-\alpha w/4})\cdot\frac{\sqrt{\alpha}}{2} w^{\ell/2}$, which for $\alpha > 100/w$ is $\frac{\sqrt{\alpha}}{2} w^{\ell/2}$ up to a negligible factor. Fix a choice of $F_1, \ldots, F_{\ell/2}$ attaining the expectation; the really good $x$ form the desired $S'$.

  26. Putting the Amplification Lemma together: $P'$ solves a denser $S' \subseteq \Sigma^{\ell/2}$ by extension, in one of two ways: either $x, y \in S'$ extend by a common prefix $u$ (first halves equal), or they extend by suffixes $F(x), G(y) \in S$ that are entry-wise different.

  27. Concretely, $P'$ on $x, y \in S' \subseteq \Sigma^{\ell/2}$ runs $P$ either on $(u, x), (u, y) \in S$ (case (i)), or on $(x, F(x)), (y, G(y)) \in S$ with $F(x), G(y)$ entry-wise different (case (ii)); in both cases a fork of the extended strings locates a fork of $(x, y)$.

  28. Conclusion [Grigni, Sipser '91]: combining the halving lemma with the Amplification Lemma, $D(\mathrm{FORK}) = \Omega(\log\ell\,\log w)$.

  29. Direct Sum
  • Direct product: the probability of success of performing $k$ independent tasks decreases with $k$. Examples: Yao's XOR lemma, the parallel repetition theorem of Ran Raz, ...
  • Direct sum: the amount of resources needed to perform $k$ independent tasks grows with $k$.
  • Here: direct sum problems in communication complexity.

  30. Direct sum settings: given $f : X_f \times Y_f \to \{0,1\}$ and $g : X_g \times Y_g \to \{0,1\}$, Alice holds $x_f \in X_f$ and $x_g \in X_g$, Bob holds $y_f \in Y_f$ and $y_g \in Y_g$, and they compute both $f(x_f, y_f)$ and $g(x_g, y_g)$. Formally, $F : X_F \times Y_F \to \{0,1\}^2$ with $X_F = X_f \times X_g$, $Y_F = Y_f \times Y_g$, and $F((x_f, x_g), (y_f, y_g)) = (f(x_f, y_f),\ g(x_g, y_g))$. The subproblems are independent: inputs range over all of $(X_f \times X_g) \times (Y_f \times Y_g)$, and a distribution $\mu_F = \mu_f \times \mu_g$ is a product of $\mu_f$ over $X_f \times Y_f$ and $\mu_g$ over $X_g \times Y_g$.

  31. Notation: $CC(f, g) := CC(F)$ for the communication complexity of solving both instances, for deterministic, randomized, nondeterministic protocols, etc.

  32. AND-composition: $F : X_F \times Y_F \to \{0,1\}$ with $X_F = X_f \times X_g$, $Y_F = Y_f \times Y_g$, and $F((x_f, x_g), (y_f, y_g)) = f(x_f, y_f) \wedge g(x_g, y_g)$; write $CC(f \wedge g) := CC(F)$, again for deterministic, randomized, nondeterministic protocols, etc.

  33. $k$-fold version: for $f : X \times Y \to \{0,1\}$, define $f^k : X^k \times Y^k \to \{0,1\}^k$ by $f^k(\vec x, \vec y) = (f(x_1, y_1), \ldots, f(x_k, y_k))$, where $\vec x = (x_1, \ldots, x_k) \in X^k$ and $\vec y = (y_1, \ldots, y_k) \in Y^k$; study $CC(f^k)$.

  34. Direct Sum Problems
  • Question I: can $CC(f^k) \ll k \cdot CC(f)$?
  • Question II: can $CC(\wedge^k f) \ll k \cdot CC(f)$?
  • "Can we solve several problems simultaneously in a way that is substantially better than solving each of the problems separately?"
  • Answer(?) to Question I: possibly "no" for all functions.
  • Contemporary tool: information complexity.

  35. Randomized protocols for $f^k : X^k \times Y^k \to \{0,1\}^k$. Two notions of correctness:
  • Individually correct: each output on $(x_i, y_i)$ is correct with probability $> 2/3$.
  • Simultaneously correct: all outputs are correct simultaneously with probability $> 2/3$.
  Direct product (conjecture): the probability of simultaneous success is $< (2/3)^{\Omega(k)}$ for any protocol with communication cost $\ll k \cdot CC(f)$. Examples in other models: the parallel repetition theorem, Yao's XOR lemma.

  36. Example: $\mathrm{EQ} : X \times Y \to \{0,1\}$ with $X = Y = \{0,1\}^n$, and $\mathrm{EQ}^k(\vec x, \vec y) = \vec z$, where $z_i$ indicates whether $x_i = y_i$. Recall $R^{\mathrm{Pub}}(\mathrm{EQ}) = O(1)$: check whether $\langle x, r\rangle = \langle y, r\rangle$, where $r$ is a shared random Boolean vector and $\langle x, r\rangle := \left(\sum_i x(i)\,r(i)\right) \bmod 2$ is the inner product over GF(2).
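
A sketch of the public-coin test (my Python rendering; `reps` controls the one-sided error $2^{-\mathrm{reps}}$, and each repetition costs two bits of communication, one inner-product bit per player):

```python
import random

def ip_equality_test(x, y, reps=2):
    """Public-coin equality test: compare <x,r> and <y,r> over GF(2).
    Never errs when x == y; misses x != y with probability 2^-reps."""
    n = len(x)
    for _ in range(reps):
        r = [random.randrange(2) for _ in range(n)]   # shared random vector
        if sum(a & b for a, b in zip(x, r)) % 2 != \
           sum(a & b for a, b in zip(y, r)) % 2:
            return False   # inner products differ: certainly x != y
    return True            # all tests passed: x == y, or we got unlucky
```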

  37. Recall the theorem from before: $R(f) = O(R^{\mathrm{Pub}}(f) + \log n)$, so public-coin protocols for $\mathrm{EQ}^k$ translate to private-coin ones at a small additive cost.

  38. Protocol for $\mathrm{EQ}^k$: repeat the $O(1)$-bit protocol on every single instance $(x_i, y_i)$ for $O(\log k)$ times, driving the per-instance error below $1/(3k)$: $\Pr[\text{output on } (x_i, y_i) \text{ is } 1 \mid x_i \neq y_i] \leq \frac{1}{3k}$ (the test errs only in this direction). By a union bound over all $k$ instances, $\Pr[\exists i:\ \text{output on } (x_i, y_i) \text{ is } 1 \mid x_i \neq y_i] \leq \frac{1}{3}$, so all $k$ answers are simultaneously correct with probability $\geq 2/3$. Hence $R^{\mathrm{Pub}}(\mathrm{EQ}^k) = O(k \log k)$ and $R(\mathrm{EQ}^k) = O(k \log k + \log n)$.
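
Continuing the sketch above (reusing `ip_equality_test`; the repetition count is my choice of constant, enough for per-instance error $\leq 1/(3k)$):

```python
import math

def eq_k(xs, ys):
    """Simultaneously correct EQ^k: per-instance error <= 1/(3k),
    so all k answers are right with probability >= 2/3 (union bound)."""
    k = len(xs)
    reps = math.ceil(math.log2(3 * k))     # O(log k) repetitions per instance
    return [ip_equality_test(x, y, reps) for x, y in zip(xs, ys)]
```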

  39. Since $R(\mathrm{EQ}^k) = O(k\log k + \log n)$ and (recall) $R(\mathrm{EQ}) = \Theta(\log n)$, consider $k = \log n$: $R(\mathrm{EQ}^k) = O(\log n\,\log\log n) \ll k \cdot R(\mathrm{EQ}) = \Theta((\log n)^2)$, a genuine direct-sum saving.

  40. Observations for general $f : X \times Y \to \{0,1\}$ with $X = Y = \{0,1\}^n$:
  • individually correct: $R(f^k) \leq k \cdot R(f)$, by applying the protocol independently on the $k$ instances;
  • simultaneously correct: $R(f^k) = O(k\log k \cdot R(f))$, by repeating $O(\log k)$ times per instance so the individual error is $\leq 1/(3k)$, then applying a union bound.

  41. The same observations hold with public coins:
  • individually correct: $R^{\mathrm{Pub}}(f^k) \leq k \cdot R^{\mathrm{Pub}}(f)$;
  • simultaneously correct: $R^{\mathrm{Pub}}(f^k) = O(k\log k \cdot R^{\mathrm{Pub}}(f))$.
  Recall: $R(f) = O(R^{\mathrm{Pub}}(f) + \log n)$.

  42. Combining (for simultaneous correctness): $R(f^k) = O(R^{\mathrm{Pub}}(f^k) + \log kn) \leq O(k\log k \cdot R^{\mathrm{Pub}}(f) + \log n)$. When $R^{\mathrm{Pub}}(f) \ll \log n$ and $R(f) = \Omega(\log n)$, this gives an acceleration over $k \cdot R(f)$ for small $k$.

  43. (Recap) individually correct: $R^{\mathrm{Pub}}(f^k) \leq k \cdot R^{\mathrm{Pub}}(f)$; simultaneously correct: $R^{\mathrm{Pub}}(f^k) = O(k\log k \cdot R^{\mathrm{Pub}}(f))$.

  44. The List-Non-Equality problem: $\mathrm{LNE}_{k,n}(\vec x, \vec y) = \bigwedge_i [x_i \neq y_i]$, with $X = Y = \{0,1\}^n$. What are $R^{\mathrm{Pub}}(\mathrm{LNE}_{k,n})$ and $R(\mathrm{LNE}_{k,n})$?
  • 1st attempt: run the inner-product protocol once on every $(x_i, y_i)$: each $x_i \neq y_i$ is missed with probability $1/3$, so $\Pr[\text{miss one of the } x_i \neq y_i]$ can be as large as $1 - (2/3)^k$.
  • 2nd attempt: run the protocol on every $(x_i, y_i)$ for $\Theta(\log k)$ times: every $x_i \neq y_i$ is missed with probability $< 1/(3k)$, but the cost is $O(k\log k)$.
  • 3rd attempt: detect every $x_i \neq y_i$ with high probability while repeating each $(x_i, y_i)$ only $O(1)$ times on average!

  45. The protocol:
  for i = 1 to k: repeat the IP protocol on $(x_i, y_i)$ until detecting $x_i \neq y_i$; break and return 0 at any time if the overall number of repetitions exceeds $Ck$;
  return 1.
  Communication: $O(Ck)$ bits. If $\exists i,\ x_i = y_i$: the loop at instance $i$ can never detect an inequality (the test has no false detections), the budget is exhausted, and the answer 0 is always correct. If $\forall i,\ x_i \neq y_i$: each trial succeeds with probability $\geq 1/2$, so wrongly answering 0 requires $(C-1)k$ failures among $Ck$ independent trials; by a Chernoff bound, with $C = 3$ this has exponentially small probability.

  46. With the budget $3k$ the communication is $O(k)$; the protocol is always correct when $\exists i,\ x_i = y_i$, and incorrect with probability $\exp(-\Omega(k))$ when $\forall i,\ x_i \neq y_i$. Hence $R^{\mathrm{Pub}}(\mathrm{LNE}_{k,n}) = O(k)$ and $R(\mathrm{LNE}_{k,n}) = O(k + \log n)$.
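
A sketch of the budgeted protocol (my rendering, reusing `ip_equality_test` with a single repetition per trial, so each trial catches an unequal pair with probability $\geq 1/2$):

```python
def lne_monte_carlo(xs, ys, C=3):
    """One-sided error LNE: always right when some x_i == y_i;
    wrong with probability exp(-Omega(k)) when all pairs differ."""
    k = len(xs)
    budget = C * k
    trials = 0
    for x, y in zip(xs, ys):
        detected = False
        while not detected:
            trials += 1
            if trials > budget:
                return 0              # budget exhausted: declare "some equal"
            detected = not ip_equality_test(x, y, reps=1)
        # certified x != y, move on to the next pair
    return 1                          # all k pairs certified unequal
```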

  47. A Las Vegas version:
  for i = 1 to k: repeat the IP protocol on $(x_i, y_i)$, at most $t$ times, until detecting $x_i \neq y_i$; if $(x_i, y_i)$ has been repeated $t$ times, Alice sends Bob $x_i$ to test whether $x_i = y_i$ exactly, and if so, break and return 0;
  return 1.
  The answer is always correct if the protocol terminates. The first $x_i = y_i$ encountered costs $O(t + n)$ bits; each $x_i \neq y_i$ costs $O\!\left(\sum_{j=1}^{t} j\,2^{-j} + n\,2^{-t}\right) = O(1)$ in expectation when $t = n$.

  48. With $t = n$, the expected communication cost is $O(k + n)$: at most one exact $n$-bit comparison is triggered before returning 0, and the unequal pairs cost $O(1)$ each in expectation. The protocol is always correct if it terminates, so $R_0^{\mathrm{Pub}}(\mathrm{LNE}_{k,n}) = O(k + n)$ and $R_0(\mathrm{LNE}_{k,n}) = O(k + n)$.
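
And the zero-error variant (my rendering; when a pair resists $t = n$ probabilistic tests, the players fall back to exchanging the whole $n$-bit string, so the output is exact):

```python
def lne_las_vegas(xs, ys):
    """Zero-error LNE: expected communication O(k + n)."""
    t = len(xs[0])                    # t = n repetitions before the fallback
    for x, y in zip(xs, ys):
        for _ in range(t):
            if not ip_equality_test(x, y, reps=1):
                break                 # certified x != y: O(1) expected trials
        else:
            # t failures in a row: Alice sends x (n bits) for an exact check
            if x == y:
                return 0              # exact comparison: never wrong
    return 1
```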

  49. Summary for $\mathrm{LNE}_{k,n} = \wedge^k\,\mathrm{EQ}_n$: Monte Carlo $R(\mathrm{LNE}_{k,n}) = O(k + \log n)$; Las Vegas $R_0(\mathrm{LNE}_{k,n}) = O(k + n)$; while for a single instance $R(\mathrm{EQ}) = \Theta(\log n)$ and $R_0(\mathrm{EQ}) = \Theta(n)$.

  50. Nondeterministic protocols. $N^1(f)$: complexity of optimally certifying the positive instances of $f$. Let $\mu$ be a probability distribution over the 1s of $f$, i.e. over $\{(x, y) \mid f(x, y) = 1\}$. Definition: the rectangle size bound of $f$ is given by $$\frac{1}{B^*(f)} := \min_{\mu \text{ over 1s}}\ \max_{R}\ \mu(R),$$ where $R$ ranges over all 1-monochromatic rectangles. Theorem: $\log_2 B^*(f) \leq N^1(f) \leq \log_2 B^*(f) + \log_2 n$.

  51. Proof of the theorem. $N^1(f) = \log_2 C^1(f)$, where $C^1(f)$ is the number of monochromatic rectangles needed to cover the 1s of $f$. Let $\mathcal{C} = \{R_1, R_2, \ldots, R_{C^1(f)}\}$ be an optimal cover. For any distribution $\mu$ over the 1s of $f$: $1 \leq \sum_{R \in \mathcal{C}} \mu(R) \leq C^1(f) \cdot \max_{R \in \mathcal{C}} \mu(R)$, so $\max_R \mu(R) \geq 1/C^1(f)$ for every $\mu$; hence $B^*(f) \leq C^1(f)$. For the other direction, build a rectangle cover greedily by always taking the rectangle of largest mass under the uniform $\mu$ over the remaining 1s: $C^1(f) \leq O(n\,B^*(f))$.

  52. Key product property: $B^*(f \wedge g) \geq B^*(f) \cdot B^*(g)$. Consequently $N^1(\wedge^k f) \geq \log B^*(\wedge^k f) \geq k\log B^*(f) \geq k(N^1(f) - \log n)$, and by symmetry $N^0(\vee^k f) \geq k(N^0(f) - \log n)$. Since $N(f^k) \geq \max(N^1(\wedge^k f),\ N^0(\vee^k f))$, we get $N(f^k) \geq k(N(f) - \log n)$, where $N(f)$ is the complexity of the optimal nondeterministic protocol for $f$.

  53. Proving $B^*(f \wedge g) \geq B^*(f) \cdot B^*(g)$: suppose the optima are achieved by $\mu_f$ and $\mu_g$, i.e. $\frac{1}{B^*(f)} = \max_R \mu_f(R)$ and $\frac{1}{B^*(g)} = \max_R \mu_g(R)$, so $\mu_f(R) \leq \frac{1}{B^*(f)}$ and $\mu_g(R) \leq \frac{1}{B^*(g)}$ for all respective 1-rectangles $R$. Goal: find a distribution $\mu$ over the 1s of $f \wedge g$ such that every 1-rectangle $R$ of $f \wedge g$ satisfies $\mu(R) \leq \mu_f(R_f)\,\mu_g(R_g)$ for some 1-rectangles $R_f$ of $f$ and $R_g$ of $g$; then $\frac{1}{B^*(f \wedge g)} \leq \max_R \mu(R) \leq \frac{1}{B^*(f)} \cdot \frac{1}{B^*(g)}$.

  54. Given $\mu_f$ over the 1s of $f$ and $\mu_g$ over the 1s of $g$, define $\mu$ on the inputs of $f \wedge g$ by $\mu((x_f, x_g), (y_f, y_g)) = \mu_f(x_f, y_f)\,\mu_g(x_g, y_g)$; this is a distribution over the 1s of $f \wedge g$. For any 1-rectangle $R$ of $f \wedge g$, the projections $R_f = \{(x_f, y_f) \mid ((x_f, *), (y_f, *)) \in R\}$ and $R_g = \{(x_g, y_g) \mid ((*, x_g), (*, y_g)) \in R\}$ are 1-rectangles of $f$ and $g$ (because of the $\wedge$: every pair in $R$ has $f = g = 1$). Moreover $R_f \times R_g = \{((x_f, x_g), (y_f, y_g)) \mid (x_f, y_f) \in R_f,\ (x_g, y_g) \in R_g\}$ is a 1-rectangle of $f \wedge g$ with $R \subseteq R_f \times R_g$, so $\mu(R) \leq \mu(R_f \times R_g) = \mu_f(R_f) \cdot \mu_g(R_g)$.

  55. Summary: the key property in the proof of $B^*(f \wedge g) \geq B^*(f) \cdot B^*(g)$ is that given $\mu_f$ over the 1s of $f$ and $\mu_g$ over the 1s of $g$, one can find a $\mu$ over the 1s of $f \wedge g$ with $\mu(R) \leq \mu_f(R_f)\,\mu_g(R_g)$ for every 1-rectangle $R$. Consequence: $N(f^k) \geq k(N(f) - \log n)$.

  56. Deterministic protocols: compare $D(f^k)$ with $k \cdot D(f)$, where $D(f)$ is the complexity of the optimal deterministic protocol for $f$. Using the theorem $D(f) \leq O(N(f)^2)$: $$D(f^k) \geq N(f^k) \geq k(N(f) - \log n) \geq \Omega\!\left(k\left(\sqrt{D(f)} - \log n\right)\right).$$

  57. Rank under $\wedge$: $\mathrm{rank}(f \wedge g) = \mathrm{rank}(f)\,\mathrm{rank}(g)$, because the communication matrix is a Kronecker product, $M_{f \wedge g} = M_f \otimes M_g$, where $$A \otimes B = \begin{pmatrix} a_{11}B & \cdots & a_{1n}B \\ \vdots & \ddots & \vdots \\ a_{m1}B & \cdots & a_{mn}B \end{pmatrix}, \qquad (A \otimes B)((i,k),(j,l)) = a_{ij}b_{kl},$$ and $\mathrm{rank}(A \otimes B) = \mathrm{rank}(A)\,\mathrm{rank}(B)$.
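
The rank identity is easy to spot-check numerically (over the reals, which is what the log-rank bound uses; a toy check I added with NumPy):

```python
import numpy as np

# rank(A (x) B) = rank(A) * rank(B) for the Kronecker product.
rng = np.random.default_rng(0)
A = rng.integers(0, 2, (4, 6))
B = rng.integers(0, 2, (5, 5))
lhs = np.linalg.matrix_rank(np.kron(A, B))
rhs = np.linalg.matrix_rank(A) * np.linalg.matrix_rank(B)
print(lhs, rhs)   # equal; e.g. M_EQ is the identity, so rank(EQ) = 2^n
```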

  58. Applying $\mathrm{rank}(f \wedge g) = \mathrm{rank}(f)\,\mathrm{rank}(g)$ to $\mathrm{LNE}_{k,n} = \wedge^k\,\mathrm{EQ}$: $\mathrm{rank}(\mathrm{LNE}_{k,n}) = \mathrm{rank}(\mathrm{EQ})^k = (2^n)^k$, so $D(\mathrm{LNE}_{k,n}) \geq \log\mathrm{rank}(\mathrm{LNE}_{k,n}) = kn = n^2$ when $k = n$. Meanwhile (recall): $R(\mathrm{LNE}_{k,n}) = O(k + \log n)$ (one-sided error, false negatives only); $R_0(\mathrm{LNE}_{k,n}) = O(k + n) = O(n)$; $N^1(\mathrm{LNE}_{k,n}) \leq R(\mathrm{LNE}_{k,n}) = O(k + \log n)$; and $N^0(\mathrm{LNE}_{k,n}) \leq O(\log k + n)$ (Alice sends $(i, x_i)$ with $x_i = y_i$ to Bob). So $N(\mathrm{LNE}_{k,n}) = O(n)$ when $k = n$.

  59. Consequence: there is a function ($\mathrm{LNE}$ with $k = n$) such that $D(f) = \Omega(N^0(f)\,N^1(f))$ and $D(f) = \Omega(R_0(f)^2)$ (both achieve the largest possible gaps).

  60. Disjointness: $\mathrm{DISJ} : X \times Y \to \{0,1\}$ with $X = Y = 2^{[n]}$; Alice holds $S \subseteq [n]$, Bob holds $T \subseteq [n]$, and $\mathrm{DISJ}(S, T) = 1$ if $S \cap T = \emptyset$ and $0$ otherwise.

  61. In Boolean form: $X = Y = \{0,1\}^n$, and $\mathrm{DISJ}(x, y) = 1$ iff $x_i y_i = 0$ for all $i$; that is, $$\mathrm{DISJ}(x, y) = \bigwedge_{i=1}^n (\bar{x}_i \vee \bar{y}_i) = \bigwedge_{i=1}^n \mathrm{NAND}(x_i, y_i).$$
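
The rewriting is just De Morgan applied per coordinate; a quick check I added:

```python
from itertools import product

def disj(x, y):
    return int(all(xi * yi == 0 for xi, yi in zip(x, y)))

def disj_via_nand(x, y):
    nand = lambda a, b: 1 - a * b            # NAND on bits
    out = 1
    for xi, yi in zip(x, y):
        out &= nand(xi, yi)                  # DISJ = AND of coordinate NANDs
    return out

assert all(disj(x, y) == disj_via_nand(x, y)
           for x in product((0, 1), repeat=3)
           for y in product((0, 1), repeat=3))
```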

  62. $D(\mathrm{DISJ}) = \Omega(n)$, by a fooling set argument. Theorem [Kalyanasundaram, Schnitger '92] [Razborov '92] [Bar-Yossef, Jayram, Kumar, Sivakumar '02]: $R(\mathrm{DISJ}) = \Omega(n)$. Theorem [Babai, Frankl, Simon '86]: the deterministic communication complexity on distributional inputs satisfies $D^\mu(\mathrm{DISJ}) = O(\sqrt{n}\,\log n)$ for all product distributions $\mu$.

  63. Idea for $R(\mathrm{DISJ}) = \Omega(n)$: can we argue $R(\mathrm{DISJ}) = R(\wedge^n\,\mathrm{NAND}) \geq \Omega(n) \cdot R(\mathrm{NAND})$? No such direct sum theorem is available for randomized communication. Instead, [Bar-Yossef, Jayram, Kumar, Sivakumar '02] go through information: $R(\mathrm{DISJ}) \geq IC_\mu(\mathrm{DISJ}) = IC_\mu(\wedge^n\,\mathrm{NAND}) \geq \Omega(n) \cdot IC_\mu(\mathrm{NAND})$.

  64. Information theory toolbox.
  Entropy: $H(X) = \sum_x P(x)\log\frac{1}{P(x)}$.
  Conditional entropy: $H(X \mid Y) = \sum_y P(y)\,H(X \mid Y = y)$.
  Mutual information: $I(X;Y) = H(X) - H(X \mid Y) = H(Y) - H(Y \mid X)$.
  Conditional mutual information: $I(X;Y \mid Z) = H(X \mid Z) - H(X \mid YZ) = I(X;YZ) - I(X;Z)$.
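
These quantities are finite sums and easy to compute for toy distributions; a small helper sketch I added (the distribution `mu` anticipates the one used below):

```python
import math
from collections import defaultdict

def H(p):
    """Entropy of a distribution given as {outcome: probability}."""
    return -sum(q * math.log2(q) for q in p.values() if q > 0)

def I(joint):
    """Mutual information I(X;Y) = H(X) + H(Y) - H(X,Y)."""
    px, py = defaultdict(float), defaultdict(float)
    for (x, y), q in joint.items():
        px[x] += q
        py[y] += q
    return H(px) + H(py) - H(joint)

mu = {(0, 0): 0.5, (0, 1): 0.25, (1, 0): 0.25}
print(H(mu), I(mu))   # I(X;Y) > 0: X and Y are correlated under mu
```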

  65. Setting: a private-coin randomized protocol $\pi$ is run on $(X, Y)$ sampled according to $\mu$, producing the communication transcript $\Pi = \Pi(X, Y, r_A, r_B)$. The mutual information $I(XY;\Pi) = H(XY) - H(XY \mid \Pi)$ is the amount of information about the inputs one can get by seeing the contents of the communication.

  66. Definition: the (external) information cost of a protocol $\pi$ is $IC_\mu(\pi) = IC^{\mathrm{ext}}_\mu(\pi) = I(XY;\Pi)$. Definition: the information complexity of $f$ is $IC_\mu(f) = \inf_\pi IC_\mu(\pi)$, where $\pi$ ranges over all private-coin randomized protocols for $f$ with bounded error on all inputs. Note that $IC_\mu(f)$ optimizes over the same protocols as $R(f)$; the input distribution $\mu$ is only used to generate $\Pi$.

  67. Basic facts. If $X$ ranges over $s$ values then $0 \leq H(X) \leq \log s$. Subadditivity: $H(X, Y) \leq H(X) + H(Y)$, with equality if and only if $X, Y$ are independent; likewise $H(X, Y \mid Z) \leq H(X \mid Z) + H(Y \mid Z)$, with equality if and only if $X, Y$ are conditionally independent given $Z$. Data processing inequality: if $X, Z$ are conditionally independent given $Y$, then $I(X;Y \mid Z) \leq I(X;Y)$.

  68. For all $\mu$, $R(f) \geq IC_\mu(f)$, where $IC_\mu(f) = \inf_\pi I(XY;\Pi)$ over private-coin protocols with bounded error on all inputs. Indeed, let $\pi$ be an optimal private-coin protocol for $f$: then $R(f) = CC(\pi) \geq H(\Pi) \geq I(XY;\Pi) \geq IC_\mu(f)$, using the fact that $\Pi$ ranges over at most $2^{CC(\pi)}$ values, so $H(\Pi) \leq CC(\pi)$.

  69. If $Z = (Z_1, \ldots, Z_n)$ are mutually independent, then $I(Z;\Pi) \geq I(Z_1;\Pi) + \cdots + I(Z_n;\Pi)$. For $\mathrm{DISJ}(x, y) = \bigwedge_i \mathrm{NAND}(x_i, y_i)$, let each $(X_i, Y_i)$ be distributed independently according to $\mu$: $\Pr[(x_i, y_i) = (0,0)] = \frac{1}{2}$ and $\Pr[(x_i, y_i) = (0,1)] = \Pr[(x_i, y_i) = (1,0)] = \frac{1}{4}$; then $(X, Y)$ follows the product distribution $\mu^n$.

  70. Let $\pi$ be an optimal private-coin protocol for $\mathrm{DISJ}$, with transcript $\Pi = \Pi(X, Y, r_A, r_B)$. By the independence above, $I(XY;\Pi) \geq \sum_{i=1}^n I(X_iY_i;\Pi)$. Note that all inputs in the support of $\mu^n$ have $\mathrm{DISJ}(X, Y) = 1$, since $(x_i, y_i) = (1,1)$ never occurs. (Is this a problem?)

  71. To make the coordinates conditionally independent, introduce "switches" $D = (D_1, \ldots, D_n)$ with each $D_i \in \{0,1\}$ uniform: if $D_i = 0$, set $Y_i = 0$ and sample $X_i \in \{0,1\}$ uniformly at random; if $D_i = 1$, set $X_i = 0$ and sample $Y_i \in \{0,1\}$ uniformly at random. This generates exactly $\mu$ per coordinate, and now $X_i, Y_i$ are conditionally independent given $D_i$! By subadditivity and data processing: $$I(XY;\Pi) \geq \sum_{i=1}^n I(X_iY_i;\Pi) \geq \sum_{i=1}^n I(X_iY_i;\Pi \mid D).$$
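
A sketch of the switched sampling (my rendering; conditioned on $D_i$, one coordinate is fixed to 0 and the other is uniform, so $X_i$ and $Y_i$ are conditionally independent given $D_i$, while the marginal of $(X_i, Y_i)$ is exactly $\mu$):

```python
import random

def sample_switched(n):
    """Sample (D, X, Y): marginally (X_i, Y_i) ~ mu, and X_i, Y_i are
    conditionally independent given the switch D_i."""
    D = [random.randrange(2) for _ in range(n)]
    X = [random.randrange(2) if d == 0 else 0 for d in D]
    Y = [0 if d == 0 else random.randrange(2) for d in D]
    return D, X, Y   # (x_i, y_i) = (1, 1) never occurs: always disjoint
```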

  72. Claim: $I(X_iY_i;\Pi \mid D) \geq IC_\mu(\mathrm{NAND} \mid D_i)$. Take $i = 1$ (the other coordinates are symmetric): $I(X_1Y_1;\Pi \mid D) = \mathbf{E}_{d_2,\ldots,d_n}[I(X_1Y_1;\Pi \mid D_1, D_2 = d_2, \ldots, D_n = d_n)]$. Fix any particular $D_2 = d_2, \ldots, D_n = d_n$. Given an input $(A, B)$ for $\mathrm{NAND}$, Alice and Bob plug it into coordinate 1; since $X_i, Y_i$ are independent for $i > 1$ once $d_i$ is fixed, they can sample $X_2, \ldots, X_n$ and $Y_2, \ldots, Y_n$ with private coins, so that $\mathrm{NAND}(A, B)$ is solved by $\Pi(AX_2\cdots X_n,\ BY_2\cdots Y_n)$; hence $I(X_1Y_1;\Pi \mid D_1, D_2 = d_2, \ldots, D_n = d_n) \geq IC_\mu(\mathrm{NAND} \mid D_1)$.

  73. Indeed, the embedding gives a private-coin protocol $\theta$ for $\mathrm{NAND}$ with bounded error on all inputs such that $I(AB;\Theta \mid D_1) = I(X_1Y_1;\Pi \mid D_1, D_2 = d_2, \ldots, D_n = d_n)$; since $\theta$ is one of the protocols over which $IC_\mu(\mathrm{NAND} \mid D_1)$ optimizes, $I(X_1Y_1;\Pi \mid D_1, D_2 = d_2, \ldots, D_n = d_n) \geq IC_\mu(\mathrm{NAND} \mid D_1)$.

  74. Putting it together: $R(f) \geq IC_\mu(f)$; for $(X, Y)$ sampled according to $\mu^n$ with transcript $\Pi = \Pi(X, Y, r_A, r_B)$, $I(XY;\Pi) \geq \sum_{i=1}^n I(X_iY_i;\Pi) \geq \sum_{i=1}^n I(X_iY_i;\Pi \mid D)$, and $I(X_iY_i;\Pi \mid D) \geq IC_\mu(\mathrm{NAND} \mid D_i)$. Hence $$R(\mathrm{DISJ}) \geq IC_\mu(\mathrm{DISJ}) = I(XY;\Pi) \geq n \cdot IC_\mu(\mathrm{NAND} \mid D_i).$$
