
A New Approach for Constructing Low-Error Two-Source Extractors

Dean Doron, Tel-Aviv University
Joint work with Avraham Ben-Aroya, Eshan Chattopadhyay, Xin Li, and Amnon Ta-Shma


Current constructions of two-source extractors

[Figure: the first source X1 over {0,1}^n supplies x1, which defines a table whose rows are nmE(x1, 1), nmE(x1, 2), ..., nmE(x1, D). The second source X2 supplies x2, which samples a subset of the rows (in the picture, rows 1, 2, 3 and 7). A function f is applied to the sampled rows, and its output is ≈ U1.]
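To make the picture concrete, here is a minimal sketch of this approach in code (our illustration: nmE, sample, and f are hypothetical placeholders for ingredients the talk has not yet pinned down):

```python
def two_source_ext_resilient(x1, x2, nmE, sample, f, D_prime):
    """The scheme in the figure: x2 selects D' rows of the one-bit
    table defined by x1, and a resilient function f combines the
    selected bits into the single output bit."""
    rows = [nmE(x1, sample(x2, j)) for j in range(D_prime)]
    return f(rows)
```

The next slides ask what f must satisfy for this to work.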

Resilient functions

The sampled table is close to being uniform and t-wise independent in the good rows. We therefore need f to be resilient: say we have D' players, an ε-fraction of whom are malicious, while the rest are uniform and t-wise independent. The honest players draw their random bits first, and only later do the malicious players draw theirs as they wish. With high probability, the outcome has small bias: the malicious players cannot substantially bias the outcome.
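As a quick sanity check on the definition, here is a small simulation (our illustration, not from the talk). Honest players are drawn fully independently, which is a special case of t-wise independence, and a single malicious player moves last. Parity is completely controlled by that one player, while majority barely moves:

```python
import random

def bias_with_one_adversary(f, n_players, trials=20000):
    """Estimate Pr[f = 1] - 1/2 when the last player is malicious:
    it sees the honest bits and then picks its own bit so as to
    force f to output 1 whenever it can."""
    ones = 0
    for _ in range(trials):
        bits = [random.randint(0, 1) for _ in range(n_players)]
        bits[-1] = 0
        if not f(bits):
            bits[-1] = 1  # the adversary flips its bit if that helps
        ones += f(bits)
    return ones / trials - 0.5

parity = lambda bits: sum(bits) % 2
majority = lambda bits: int(sum(bits) > len(bits) // 2)

n = 101  # odd, so majority has no ties
print("parity:  ", bias_with_one_adversary(parity, n))    # ~0.5: one player fully controls parity
print("majority:", bias_with_one_adversary(majority, n))  # ~0.04: bias only O(1/sqrt(n))
```

Majority is far from the best resilient function, but it already shows the contrast with parity.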

The bottleneck

A corollary of [KKL88]: even one malicious player can bias the output with probability at least log D' / D'. We therefore cannot hope for an error smaller than 1/D', where D' is the size of our sampled table. Thus, the running time is at least 1/ε.
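Spelling out the arithmetic behind the last two sentences (our phrasing):

```latex
\[
  \varepsilon \;\ge\; \frac{\log D'}{D'} \;\ge\; \frac{1}{D'}
  \qquad\Longrightarrow\qquad
  D' \;\ge\; \frac{1}{\varepsilon},
\]
```

so merely writing down the sampled table takes time at least 1/ε; for a negligible error such as ε = 2^(-√n), this is already super-polynomial.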

Today's talk

- Two-source extractors and the low-error challenge.
- Seeded and non-malleable extractors.
- Current constructions of two-source extractors via non-malleable extractors, and where they fail in achieving small error.
- Constructing low-error two-source extractors given "good" non-malleable extractors.

Getting a small error

We should abandon resilient functions if we want to get a small error. In current constructions, we need the sampled set to contain many good rows. Instead of trying to sample and then employ t-wise independence in the good rows, let's just try and hit a good row, a weaker sampling guarantee. We hit with a disperser.

Dispersers

[Figure: a bipartite graph from {0,1}^n = [N] to {0,1}^m = [M]. A set A with |A| ≥ K on the left is mapped to a neighborhood Γ(A, [D]) of cardinality greater than K', so it cannot be swallowed by a small bad set B.]

Γ: {0,1}^n × [D] → {0,1}^m is a (K, K')-disperser if for every set A of cardinality at least K, Γ maps A to a set of cardinality greater than K'.

We are interested in the case where K' is small compared to 2^m. That is, we want to avoid small bad sets.

[RT]: When K' is not too large, say K' = εM, the lower bound on the degree is D = Ω( log(N/K) / log(1/ε) ).
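To internalize the definition, here is a brute-force checker, a minimal sketch (the interface and the toy Γ below are our assumptions; this is only feasible for tiny parameters):

```python
from itertools import combinations

def is_disperser(Gamma, n, D, K, K_prime):
    """Brute-force check that Gamma: {0,1}^n x [D] -> [M] is a
    (K, K')-disperser: every A with |A| >= K has |Gamma(A, [D])| > K'.
    Checking only |A| = K suffices, since neighborhoods grow with A."""
    for A in combinations(range(2 ** n), K):
        neighborhood = {Gamma(x, i) for x in A for i in range(D)}
        if len(neighborhood) <= K_prime:
            return False
    return True

# Toy example: Gamma(x, i) = (x + i) mod 8 maps [16] x [2] -> [8].
print(is_disperser(lambda x, i: (x + i) % 8, n=4, D=2, K=8, K_prime=4))
```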

Explicit dispersers

Quite amazingly, when K = N^𝜀 for a constant 𝜀 < 1 (alternatively, for min-entropy k = 𝜀n), there exist explicit constructions that achieve this bound [BKSSW05, Raz05, Zuckerman06]. The key ingredient in Zuckerman's beautiful construction is a point-line incidence graph. The construction also gives sub-optimal results for lower k's, where 𝜀 is sub-constant.

Our reduction

We are given a source X1 over {0,1}^n1 with min-entropy k1 and a source X2 over {0,1}^n2 with min-entropy k2 (with K2 = 2^k2).

Ingredients:
- nmE: {0,1}^n1 × [D] → {0,1}^m, a t-non-malleable extractor with error ε.
- Γ: {0,1}^n2 × [t+1] → [D], an (εK2, εD)-disperser.

On input x1, x2, output ⊕_{i∈[t+1]} nmE(x1, Γ(x2, i)).
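The construction is simple enough to wire up directly. Here is a sketch treating the two ingredients as black boxes (nmE and Gamma stand for whatever instantiation is plugged in; outputs are treated as m-bit integers so that XOR is bitwise):

```python
def two_source_ext(x1, x2, nmE, Gamma, t):
    """The reduction: XOR the non-malleable extractor's outputs over
    the t+1 rows that the disperser selects from x2."""
    out = 0
    for i in range(t + 1):
        seed = Gamma(x2, i)   # the i-th neighbor of x2 in the disperser
        out ^= nmE(x1, seed)  # one row of the table defined by x1
    return out
```

Contrast this with the earlier sketch: the resilient function f is gone, and the sample has shrunk from D' rows to t+1.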

Our reduction

[Figure: x1 ∈ [N1] again defines the table nmE(x1, 1), ..., nmE(x1, D). This time x2 ∈ [N2] is fed to the degree-(t+1) disperser Γ, which selects t+1 rows of the table (in the picture, rows 1, 2, 3 and 7). The XOR of the selected rows is the output, which is ≈ U1. No resilient functions here!]

Correctness overview

(Recall: on input x1, x2, we output ⊕_{i∈[t+1]} nmE(x1, Γ(x2, i)).)

The source X1 defines a set of good and bad seeds for the n.m. extractor. Let G be the set of good seeds, of density at least 1 - ε.

Γ is an (εK2, εD)-disperser, so the number of elements x2 for which Γ(x2, [t+1]) contains only bad seeds is at most εK2: otherwise, these x2's would form a set of cardinality greater than εK2 that is mapped entirely into the bad seeds, a set of cardinality at most εD, contradicting the disperser property.

Thus, with probability at least 1 - εK2/K2 = 1 - ε, the input x2 samples t+1 seeds of nmE, one of which, y, is good.

In such a case, nmE(X1, y) is ε-close to uniform, even conditioned on t arbitrary outputs! This is since for every y ∈ G and any y1, ..., yt ∈ {0,1}^d \ {y}, it holds that (nmE(X1, y), nmE(X1, y1), ..., nmE(X1, yt)) is ε-close to (U, nmE(X1, y1), ..., nmE(X1, yt)).

Hence, the parity of the sampled random variables is also close to uniform, and the overall error is 2ε.
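In symbols, the error accounting is a union bound (Δ denotes statistical distance; the notation is ours):

```latex
\[
  \Delta\!\left( \bigoplus_{i \in [t+1]} \mathrm{nmE}\bigl(X_1, \Gamma(X_2, i)\bigr) ,\; U \right)
  \;\le\;
  \underbrace{\varepsilon}_{\Pr[\text{no good seed among } \Gamma(X_2,\,[t+1])]}
  \;+\;
  \underbrace{\varepsilon}_{\text{nmE's error on the good seed } y}
  \;=\; 2\varepsilon .
\]
```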

Our reduction

So, if the n.m. extractor can support small error (and existing constructions can), we get a construction with small error.

The parity is not resilient... what happened here? We proposed a different approach:
- Instead of sampling D' rows from the table and applying a resilient function, we pick a drastically smaller sample set, of size t+1.
- Instead of requiring that the number of malicious players be small, we have the weaker requirement that not all of the players in our sample set are malicious.

But does it work?

Or rather, when does it work? We have no option but to look closer into the parameters.

A potential circularity hazard:
- The degree of Γ should be at most t+1, but
- the degree of Γ also depends on the seed length of the n.m. extractor, which in turn depends on t...

Our result
