A Tight Lower Bound for Entropy Flattening
(PowerPoint presentation transcript)


A Tight Lower Bound for Entropy Flattening
  Yi-Hsiu Chen¹, Mika Göös¹, Salil Vadhan¹, Jiapeng Zhang²
  ¹Harvard University, USA   ²UC San Diego, USA
  June 23, 2018
  1 / 18

Agenda
  1 Problem Definition / Model
  2 Cryptographic Motivations
  3 Proof Techniques
  2 / 18

Flatness
  Definition (Entropies). Let X be a distribution over {0,1}^n. Define the surprise of x to be H_X(x) = log(1 / Pr[X = x]).
    H_sh(X)  := E_{x ~ X}[H_X(x)]
    H_min(X) := min_x H_X(x)
    H_max(X) := log |Supp(X)|   (≤ max_x H_X(x))
  H_min(X) ≤ H_sh(X) ≤ H_max(X).   (The gap can be Θ(n).)
  A source X is flat iff H_sh(X) = H_min(X) = H_max(X).
  3 / 18
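The three entropy measures on this slide are easy to compute for a finite distribution. The sketch below (illustrative only, not from the paper) evaluates them on a small non-flat source and a flat one:

```python
# Surprise H_X(x) = log2(1/Pr[X = x]); Shannon, min-, and max-entropy of a
# distribution given as a probability vector.
import math

def shannon_entropy(p):
    """H_sh(X) = E_{x ~ X}[log2(1 / Pr[X = x])]."""
    return sum(px * math.log2(1 / px) for px in p if px > 0)

def min_entropy(p):
    """H_min(X) = min_x log2(1 / Pr[X = x]) = -log2(max_x Pr[X = x])."""
    return -math.log2(max(p))

def max_entropy(p):
    """H_max(X) = log2 |Supp(X)|."""
    return math.log2(sum(1 for px in p if px > 0))

n = 4
# A non-flat source over {0,1}^n: one heavy point, the rest uniform.
p = [1 / 2] + [1 / (2 * (2 ** n - 1))] * (2 ** n - 1)
assert min_entropy(p) < shannon_entropy(p) < max_entropy(p)

# A flat source: all three entropies coincide.
flat = [1 / 2 ** n] * 2 ** n
assert min_entropy(flat) == shannon_entropy(flat) == max_entropy(flat)
```

Here the min/max gap is log2(2^n − 1) + 1 − 1 ≈ n − 1, matching the Θ(n) gap mentioned on the slide.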



Entropy Flattening
  Flattening Algorithm A:
    Input source X  →  Output source Y, nearly flat (H_sh(Y) ≈ H_min(Y) ≈ H_max(Y)).
  Entropies of the output and input sources are monotonically related.
  [Figure: the large min-to-max entropy gap of the input X shrinks to a small gap around the Shannon entropy of the output Y.]
  4 / 18


Entropy Flattening
  Entropy Flattening Problem. Find a flattening algorithm A such that:
    If H_sh(X) ≥ τ + 1, then H^ε_min(Y) ≥ k + Δ.
    If H_sh(X) ≤ τ − 1, then H^ε_max(Y) ≤ k − Δ.
  Smooth Entropies:
    H^ε_min(Y) ≥ k if ∃ Y′ s.t. H_min(Y′) ≥ k and d_TV(Y, Y′) ≤ ε.
    H^ε_max(Y) ≤ k if ∃ Y′ s.t. H_max(Y′) ≤ k and d_TV(Y, Y′) ≤ ε.
  5 / 18
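Smooth entropies can be computed for small distributions. The sketch below (an illustration, not the paper's machinery) uses a standard trimming heuristic: for H^ε_min, cap the heaviest probabilities so that at most ε mass is moved, assuming the trimmed mass can be pushed onto lighter outcomes; for H^ε_max, delete the lightest outcomes until ε mass is removed:

```python
# Illustrative smooth-entropy computations for a probability vector p.
import math

def smooth_min_entropy(p, eps):
    """Smallest cap c with sum(max(p_i - c, 0)) <= eps; assuming the trimmed
    mass can be redistributed below the cap, H^eps_min = -log2(c)."""
    lo, hi = 0.0, max(p)
    for _ in range(60):  # binary search on the cap
        mid = (lo + hi) / 2
        trimmed = sum(max(px - mid, 0.0) for px in p)
        if trimmed > eps:
            lo = mid
        else:
            hi = mid
    return -math.log2(hi)

def smooth_max_entropy(p, eps):
    """Drop the lightest outcomes (total mass <= eps); H^eps_max is the log
    of the remaining support size."""
    support = sorted(px for px in p if px > 0)
    removed, dropped = 0.0, 0
    for px in support:
        if removed + px <= eps:
            removed += px
            dropped += 1
        else:
            break
    return math.log2(len(support) - dropped)

p = [0.5, 0.25, 0.25]
assert smooth_min_entropy(p, 0.0) == 1.0    # eps = 0 recovers plain H_min
assert smooth_min_entropy(p, 0.25) == 2.0   # smoothing flattens the heavy point
```

So a small amount of smoothing can raise the min-entropy substantially, which is exactly why the flattening problem is stated with smooth entropies.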


Solution: Repetition
  Theorem ([HILL99, HR11]). Let X be a distribution over {0,1}^n and let Y = (X_1, ..., X_q), where the X_i are i.i.d. copies of X. Then
    H^ε_min(Y), H^ε_max(Y) ∈ H_sh(Y) ± O(n · √(q · log(1/ε)))
                            = q · H_sh(X) ± O(n · √(q · log(1/ε))).
  (This is the Asymptotic Equipartition Property (AEP) from information theory.)
  q = O(n²) is sufficient for a constant entropy gap.
  q = Ω(n²) is needed for repetition, due to anti-concentration results [HR11].
  6 / 18
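The AEP behavior is easy to see empirically: the surprise of a q-tuple is the sum of q i.i.d. per-coordinate surprises, so it concentrates around q·H_sh(X) within a few n·√q. A small simulation (source and parameters chosen for illustration):

```python
# Simulating the repetition theorem: surprise of Y = (X_1,...,X_q)
# concentrates around q * H_sh(X).
import math
import random

random.seed(0)
n = 4
# A non-flat source over 2^n outcomes: one heavy point, the rest uniform.
p = [1 / 2] + [1 / (2 * (2 ** n - 1))] * (2 ** n - 1)
h_sh = sum(px * math.log2(1 / px) for px in p)

q = 400
outcomes = list(range(2 ** n))
trials = 200
deviations = []
for _ in range(trials):
    xs = random.choices(outcomes, weights=p, k=q)
    surprise = sum(math.log2(1 / p[x]) for x in xs)  # H_Y(y) for this sample
    deviations.append(abs(surprise - q * h_sh))

# The typical deviation is O(n * sqrt(q)), far below the worst case q * n.
typical = sorted(deviations)[int(0.9 * trials)]  # 90th percentile
assert 0 < typical <= 3 * n * math.sqrt(q)
assert typical < 0.1 * q * n
```

With q = Θ(n²), the ±O(n√q) deviation becomes a Θ(n²)-entropy output with only a constant additive gap, matching the slide.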



Query Model
  The Model:
    Input source: encoded by a function f : {0,1}^n → {0,1}^m and defined as f(U_n).
    Flattening algorithm: an oracle algorithm A^f : {0,1}^{n′} → {0,1}^{m′} with query access to f.
    Output source: A^f(U_{n′}).
    Example: A^f(r_1, ..., r_q) = (f(r_1), ..., f(r_q)).
  Def: Flattening Algorithm
    H_sh(f(U_n)) ≥ τ + 1  ⇒  H^ε_min(A^f(U_{n′})) ≥ k + Δ
    H_sh(f(U_n)) ≤ τ − 1  ⇒  H^ε_max(A^f(U_{n′})) ≤ k − Δ
  More powerful algorithms are allowed:
    Querying correlated positions, or even querying adaptively.
    Computation on the query inputs, e.g., hashing.
  7 / 18
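The query model can be sketched directly: the algorithm only sees f through oracle calls, so its cost is the number of calls. Below, a counting wrapper (an illustrative device, not from the paper) measures the q queries made by the repetition example; the toy oracle f is likewise just an assumption for the demo:

```python
# Oracle query model: A^f gets query access to f and nothing else.
import random

random.seed(1)
n = 4

def f(x):
    # Toy oracle on n-bit inputs (illustrative): collapses some inputs,
    # so f(U_n) is a non-flat source.
    return x >> (x % 2)

class CountingOracle:
    """Wraps an oracle and counts queries, i.e., the algorithm's cost."""
    def __init__(self, func):
        self.func, self.queries = func, 0
    def __call__(self, x):
        self.queries += 1
        return self.func(x)

def repetition_A(oracle, randomness):
    """A^f(r_1, ..., r_q) = (f(r_1), ..., f(r_q)): q non-adaptive queries."""
    return tuple(oracle(r) for r in randomness)

q = 3
rs = [random.randrange(2 ** n) for _ in range(q)]
oracle = CountingOracle(f)
y = repetition_A(oracle, rs)
assert len(y) == q and oracle.queries == q
```

Adaptive or correlated-query algorithms fit the same interface; they just choose later query points based on earlier answers.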


Main Theorems
  Theorem. Flattening algorithms for n-bit oracles f require Ω(n²) oracle queries.
  Def: SDU Algorithm
    H_sh(f(U_n)) ≥ τ + 1  ⇒  d_TV(A^f(U_{n′}), U_{m′}) < ε.
    H_sh(f(U_n)) ≤ τ − 1  ⇒  |Supp(A^f(U_{n′}))| / 2^{m′} ≤ ε.
  Flattening Algorithm ⇐⇒ SDU Algorithm
  (a reduction between two NISZK-complete problems [GSV99])
  Theorem. SDU algorithms for n-bit oracles f require Ω(n²) oracle queries.
  8 / 18
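The two SDU conditions measure different things: closeness to uniform in total variation distance, and the fraction of the range that is hit at all. Both are one-liners for small distributions (the example vectors are illustrative, not from the paper):

```python
# The two quantities appearing in the SDU definition.
def d_tv(p, q):
    """Total variation distance between distributions on the same alphabet."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q)) / 2

def support_fraction(p):
    """|Supp(p)| / alphabet size."""
    return sum(1 for pi in p if pi > 0) / len(p)

m = 3  # output length m', alphabet {0,1}^3
uniform = [1 / 2 ** m] * 2 ** m

near_uniform = [1 / 8 + 0.01, 1 / 8 - 0.01] * 4   # "high-entropy" case
assert d_tv(near_uniform, uniform) < 0.1           # close to U_{m'}

sparse = [0.5, 0.5, 0, 0, 0, 0, 0, 0]              # "low-entropy" case
assert support_fraction(sparse) == 0.25            # hits a small fraction
```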




Connection to Cryptographic Constructions
  Example: OWF f → PRG g^f ([HILL90, Hol06, HHR06, HRV10, VZ13]):
    1 Create a gap between “pseudoentropy” and (true) entropy.
    2 Guess the entropy threshold τ (or other tricks).   [Õ(n) queries]
    3 Flatten entropies.   [Õ(n²) queries]
    4 Extract the pseudorandomness (via universal hashing).
  Overall, the best PRG makes Õ(n³) queries to the one-way function [HRV10, VZ13].
  From a regular one-way function, Step 3 is unnecessary, so Õ(n) queries suffice [HHR06].
  Holenstein and Sinha [HS12] prove that any black-box construction requires Ω̃(n) queries. (This lower bound comes from Step 2 and applies even to regular OWFs.)
  Can we do better in the entropy flattening step?
  9 / 18

Overview of the Proof
  Def: SDU Algorithm
    H_sh(f(U_n)) ≥ τ + 1  ⇒  d_TV(A^f(U_{n′}), U_{m′}) < ε.
    H_sh(f(U_n)) ≤ τ − 1  ⇒  |Supp(A^f(U_{n′}))| / 2^{m′} ≤ ε.
  1 Construct distributions D_H and D_L:
    If f is sampled from D_H, then H_sh(f(U_n)) ≥ τ + 1 w.h.p.
    If f is sampled from D_L, then H_sh(f(U_n)) ≤ τ − 1 w.h.p.
  2 A cannot “behave very differently” on the two distributions while making only q = o(n²) queries.
  10 / 18


Construction of f
  Partition the domain into s blocks, each with t elements (s · t = 2^n).
    Concentrated block: all t elements map to the same image element.
    Scattered block: the t elements map to t distinct image elements.
  [Figure: f maps {0,1}^n, split into 2^{3n/4} blocks of 2^{n/4} elements each, into {0,1}^m; each block is labeled scattered (Sca) or concentrated (Con).]
  If ≥ s · (1/2 + 4/n) blocks are scattered, then H_sh(f(U_n)) ≥ 7n/8 + 1.
  If ≤ s · (1/2 − 4/n) blocks are scattered, then H_sh(f(U_n)) ≤ 7n/8 − 1.
  11 / 18
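The entropy of this construction has a closed form: a scattered block contributes t image points of mass 1/2^n each (surprise n), a concentrated block one point of mass t/2^n (surprise n − log₂ t = 3n/4). With a fraction α of blocks scattered, H_sh(f(U_n)) = n − (1 − α)·n/4, which gives 7n/8 at α = 1/2 and shifts by ±1 at α = 1/2 ± 4/n. A small exact check (n = 8, where the ±4/n thresholds are the all-scattered and all-concentrated extremes):

```python
# Exact Shannon entropy of the scattered/concentrated block construction.
import math

def block_entropy(n, scattered_blocks):
    """H_sh(f(U_n)) when scattered_blocks of the s = 2^{3n/4} blocks
    (each of t = 2^{n/4} elements) are scattered; requires 4 | n."""
    s, t = 2 ** (3 * n // 4), 2 ** (n // 4)
    con = s - scattered_blocks
    # scattered blocks: scattered_blocks * t points, each with Pr = 1/2^n
    h = scattered_blocks * t * (1 / 2 ** n) * n
    # concentrated blocks: con points, each with Pr = t/2^n
    h += con * (t / 2 ** n) * math.log2(2 ** n / t)
    return h

n = 8
s = 2 ** (3 * n // 4)
assert abs(block_entropy(n, s // 2) - 7 * n / 8) < 1e-9      # half scattered
assert abs(block_entropy(n, s) - (7 * n / 8 + 1)) < 1e-9     # alpha = 1/2 + 4/n
assert abs(block_entropy(n, 0) - (7 * n / 8 - 1)) < 1e-9     # alpha = 1/2 - 4/n
```

So the scattered fraction controls the Shannon entropy linearly, and crossing 1/2 by ±4/n moves the entropy across 7n/8 by exactly ±1.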


D_H and D_L
  [Figure: the same block structure as before: 2^{3n/4} blocks of 2^{n/4} elements, each scattered or concentrated.]
  1 Randomly partition {0,1}^n into 2^{3n/4} blocks.
  2 Decide independently whether each block is scattered or concentrated:
    D_H: each block scattered with probability 1/2 + 5/n; then w.h.p. ≥ s · (1/2 + 4/n) blocks are scattered.
    D_L: each block scattered with probability 1/2 − 5/n; then w.h.p. ≤ s · (1/2 − 4/n) blocks are scattered.
  12 / 18
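The "w.h.p." step is a Chernoff bound: with s = 2^{3n/4} independent blocks, the scattered fraction deviates from its mean 1/2 ± 5/n by more than 1/n with probability exponentially small in s/n². A quick seeded simulation (n chosen small for speed; at this scale the margins are far wider than asymptotically needed):

```python
# Sampling the block types of f ~ D_H and f ~ D_L and checking that the
# scattered fraction clears the 1/2 +- 4/n thresholds.
import random

random.seed(2)
n = 16
s = 2 ** (3 * n // 4)  # 4096 blocks

def sample_scattered_count(p_scatter):
    """Number of scattered blocks when each block is scattered i.i.d."""
    return sum(random.random() < p_scatter for _ in range(s))

high = sample_scattered_count(0.5 + 5 / n)  # f ~ D_H
low = sample_scattered_count(0.5 - 5 / n)   # f ~ D_L
assert high >= s * (0.5 + 4 / n)   # w.h.p.: H_sh(f(U_n)) >= 7n/8 + 1
assert low <= s * (0.5 - 4 / n)    # w.h.p.: H_sh(f(U_n)) <= 7n/8 - 1
```

The slack between the sampling bias 5/n and the entropy threshold 4/n is exactly what makes both events hold simultaneously with high probability.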
