Part II: Pseudorandom Correlation Generators — What are they? How can we build them?


  1. Part II: Pseudorandom Correlation Generators • What are they? • How can we build them?

  2. Recall: Succinct Secure Computation from HSS. The parties securely compute Share(x, x′); each runs Eval C on its share to obtain an output share, with w₀ + w₁ = C(x, x′); finally they exchange the additive output shares.

  3. What if Additive Shares ARE the Output Goal? Same picture: securely compute Share(x, x′), each party runs Eval C on its share to obtain w_i with w₀ + w₁ = C(x, x′) — but the final exchange is not needed!

  4. Is this ever actually desired?

  5. Secure Computation with Preprocessing [Beaver ’91]. An interactive preprocessing protocol produces correlated randomness, and this phase dominates the overall cost. The online phase, on inputs y and z, computes g(y, z) and is information-theoretic, with constant computation and communication overhead.

  6. Secure Computation with Silent Preprocessing [BCGI 18, BCGIKS 19]. A “small” setup protocol realizes a setup functionality handing each party a short, correlated seed; the parties then silently expand the seeds into correlated pseudorandomness. This gives less communication and lower storage costs. The online phase, on inputs y and z, computes g(y, z) as before.

  7. “Silent” Generation of Large Correlations. Securely compute Share(x, x′); each party then locally runs Eval C on its share to obtain its output shares w₀, w₁ — no further interaction needed.

  8. Pseudorandom correlation generators

  9. Standard Pseudorandom Generators (PRG). A short random seed s expands to long pseudorandomness: for every PPT distinguisher, PRG(s) ≈ random.
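A minimal length-doubling-style PRG sketch, using SHA-256 in counter mode as an illustrative stand-in (not a construction from the talk):

```python
import hashlib

def prg(seed: bytes, out_len: int) -> bytes:
    """Expand a short seed into out_len pseudorandom bytes (SHA-256 counter mode)."""
    out = b""
    counter = 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:out_len]

stream = prg(b"short-seed", 1024)  # 1024 pseudorandom bytes from a short seed
```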

  10. Pseudorandom Correlation Generators (PCG). Short seeds s₀ and s₁ are expanded (via PCG₀, PCG₁) into long pseudorandom outputs X₀ and X₁, jointly distributed like long correlated outputs (Y₀, Y₁) ← 𝒟. Challenge: the seed s_σ should give no information on Y_{1−σ} beyond Y_σ.

  11. Constructions of PCGs:
  • [GI 99]: truth tables (PRG)
  • [CDI 05]: multi-party linear correlations (PRG)
  • [BCGI 18]: vector-OLE (LPN)
  • [BCGIKS 19]: constant-degree poly (LPN), OT (LPN)
  • [BCGIKRS 19]: multi-party bilinear (LPN, LWE), low-degree via HSS (LWE, pairings, MQ)
  Also: [BCGIO 17], [S 18] (less practical)

  12. Generic PCG construction for “Additive” Correlations from HSS

  13. Additive Correlations. A distribution R is sampled and split into keys K0, K1. Examples: Beaver triples (a, b, ab) …; authenticated Beaver triples (a, b, ab) together with τ·(a, b, ab) …; truth-table correlations f(x − r1, y − r2) …; additive shares generally …
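As one concrete instance, additive shares of a Beaver triple (a, b, ab) over a prime field can be sampled directly (a plain sampling sketch to fix notation — a PCG's job is to produce such keys from short seeds):

```python
import random

P = 2**31 - 1  # a prime modulus (illustrative choice)

def share(v):
    """Split v into two additive shares mod P."""
    r = random.randrange(P)
    return r, (v - r) % P

def beaver_triple():
    """Sample (a, b, ab) and hand each party one additive share of each value."""
    a, b = random.randrange(P), random.randrange(P)
    a0, a1 = share(a)
    b0, b1 = share(b)
    c0, c1 = share(a * b % P)
    return (a0, b0, c0), (a1, b1, c1)

k0, k1 = beaver_triple()  # parties' correlated outputs K0, K1
```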

  14. Homomorphic Secret Sharing (HSS) for a program class P. Share x into x₀, x₁. • Security: each xᵢ hides x. • Size: |xᵢ| ~ |x|. • Correctness: Eval_P(x₀) + Eval_P(x₁) = P(x).

  15. HSS ⇒ PCG for Additive Correlations [BCGIO17]. Consider the program P that applies a PRG expansion to its input (long pseudorandomness) and then samples from the distribution R. Share the short seed s into s₀, s₁; running Eval_P on the shares yields the keys K0, K1. Question: is there concretely efficient HSS for a PRG?

  16. Landscape of HSS — Concrete Efficiency:
  • “High-level”: LWE+ — Circuits [DHRW16, BGI15, BGILT18] (builds on top of FHE…)
  • “Mid-level” (lightweight HSS for simple computations): DDH — Branching Programs [BGI16, BCGIO17, DKK18]; Paillier — Branching Programs [FGJS17]; LWE — Branching Programs [BKS19] (faster…)
  • “Low-level” (fast!): OWF — Point Functions, Conjunctions, Intervals, Decision Trees [GI14, BGI15, BGI16b]; ~200 million Evals per second [GKWY19]; growing number of applications…
  • “Algorithmica”: None — Linear Functions [Ben86]

  17. Concrete Efficiency…? Yes! Concretely efficient PCGs based on LPN. For today: • the “vector OLE” correlation • the “oblivious transfer” correlation.

  18. Oblivious Linear Evaluation (OLE) [NP99] & Vector OLE. These enable secure computation of arithmetic circuits / vector operations. OLE over a field 𝔽: one party inputs x, the other inputs a, b, and the first learns ax + b. Vector OLE (VOLE): one party inputs x, the other inputs vectors a, b, and the first learns ax + b componentwise.
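The VOLE correlation is easy to state in code (sampled in the clear here, just to fix notation; a PCG's job is to produce these outputs from short seeds):

```python
import random

P = 2**31 - 1  # prime modulus (illustrative choice)

def sample_vole(n):
    """Sample a length-n VOLE correlation: (a, b) for party 0, (x, ax + b) for party 1."""
    a = [random.randrange(P) for _ in range(n)]
    b = [random.randrange(P) for _ in range(n)]
    x = random.randrange(P)
    c = [(ai * x + bi) % P for ai, bi in zip(a, b)]
    return (a, b), (x, c)

(a, b), (x, c) = sample_vole(8)
# Correlation check: c = a*x + b componentwise
assert all((ai * x + bi) % P == ci for ai, bi, ci in zip(a, b, c))
```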

  19. Goal: PCG for the Vector OLE Correlation. • Pseudorandom vectors a, b. • Pseudorandom field element x, together with ax + b. Short seeds s₀, s₁ expand via PCG₀, PCG₁ to (a, b) and (x, ax + b).

  20. Learning Parity with Noise (LPN) over a field 𝔽 (like LWE, but with low-Hamming-weight noise). For a public random matrix M of random 𝔽 elements, a secret s, and sparse noise e: (M, M·s + e) ≈ (M, uniform), even given M. Note: parameterized by M and by the noise distribution.
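A small numeric illustration of the shape of an LPN instance over a prime field (toy parameters, far too small for any security):

```python
import random

P = 2**31 - 1          # prime modulus (illustrative choice)
k, n, t = 16, 64, 4    # secret dimension, number of samples, noise weight (toy)

def lpn_instance():
    """Return (M, b) with b = M*s + e for secret s and t-sparse noise e."""
    M = [[random.randrange(P) for _ in range(k)] for _ in range(n)]
    s = [random.randrange(P) for _ in range(k)]
    e = [0] * n
    for i in random.sample(range(n), t):   # t random nonzero noise positions
        e[i] = random.randrange(1, P)
    b = [(sum(Mij * sj for Mij, sj in zip(row, s)) + ei) % P
         for row, ei in zip(M, e)]
    return M, b   # LPN assumption: (M, b) ≈ (M, uniform)

M, b = lpn_instance()
```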

  21. Our LPN Regime. • LPN over a large field 𝔽: currently no better attacks known than over 𝔽₂. • High-dimension secret. • Low noise (noise rate 1/ℓ^ε for some constant ε; sparse noise). • Bounded number of samples. In this regime: • No improvement known over the standard Gaussian-elimination attack (guessing noise-free coordinates). • Not known to imply public-key encryption.

  22. Learning Parity with Noise (LPN) over 𝔽. Idea: leverage linearity to reduce the problem to the sparse case. As before: (M, M·s + e) ≈ (M, uniform) for sparse noise e, even given M.

  23. Primal Construction. • Start with a short VOLE correlation: one party holds short vectors c, d; the other holds x and cx + d.

  24. Primal Construction. • Start with a short VOLE correlation… and expand using a public matrix M: C = M·c, D = M·d, so that Cx + D = M·(cx + d). VOLE correctness ✔, but VOLE security ❌: C and D are distinguishable from random!

  25. Primal Construction. Add noise: C = M·c + a′ (a′ sparse noise), D = M·d + b′; then Cx + D = M·(cx + d) + (a′x + b′). VOLE security ✔, VOLE correctness ✔.
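A self-contained simulation of the primal expansion, with the additive shares of a′x written down directly rather than generated by a punctured PRF, and toy parameters (insecure, for arithmetic sanity only):

```python
import random

P = 2**31 - 1       # prime modulus (illustrative choice)
k, n, t = 4, 12, 2  # short length, long length, noise weight (toy)

def matvec(M, v):
    return [sum(mij * vj for mij, vj in zip(row, v)) % P for row in M]

M = [[random.randrange(P) for _ in range(k)] for _ in range(n)]

# Short VOLE correlation: P0 holds (c, d); P1 holds x and w = c*x + d
x = random.randrange(P)
c = [random.randrange(P) for _ in range(k)]
d = [random.randrange(P) for _ in range(k)]
w = [(ci * x + di) % P for ci, di in zip(c, d)]

# Sparse noise a_ (known to P0) and additive shares r0 + r1 = a_ * x
a_ = [0] * n
for i in random.sample(range(n), t):
    a_[i] = random.randrange(1, P)
r0 = [random.randrange(P) for _ in range(n)]
r1 = [(ai * x - r0i) % P for ai, r0i in zip(a_, r0)]

# P0 expands locally: a = M c + a_,  b = M d - r0
a = [(u + v) % P for u, v in zip(matvec(M, c), a_)]
b = [(u - v) % P for u, v in zip(matvec(M, d), r0)]
# P1 expands locally: y = M w + r1, which equals a*x + b componentwise
y = [(u + v) % P for u, v in zip(matvec(M, w), r1)]

assert all((ai * x + bi) % P == yi for ai, bi, yi in zip(a, b, y))
```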

  26. Primal Construction. LPN ⇒ it suffices to compress this distribution: C = M·c + a′ (a′ sparse noise), D = M·d + b′, where the parties hold secret shares of a′x, so that Cx + D gives the VOLE output.

  27. Compressing Sparse Correlations (note: a′ can be represented succinctly). Wanted: keys K0, K1 expanding to (a′, b′) and (x, a′x + b′) — i.e., secret shares of a′x.

  28. Compressing Sparse Correlations — idea: use “punctured PRFs” (note: a′ can be represented succinctly). Secret shares of a′x amount to a long pseudorandom string and a string that differs from it in 1 position (the location of a′). What if a′ had just ONE nonzero value?

  29. GGM Pseudo-Random Function (PRF) [Goldreich-Goldwasser-Micali 84]. Starting from a seed s, repeatedly apply a length-doubling PRG down a binary tree: the leaves form very long pseudorandomness.
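A sketch of GGM evaluation, using a SHA-256-based length-doubling PRG as an illustrative stand-in:

```python
import hashlib

def prg_double(seed: bytes):
    """Length-doubling PRG: one seed in, (left, right) child seeds out."""
    return (hashlib.sha256(seed + b"0").digest(),
            hashlib.sha256(seed + b"1").digest())

def ggm_eval(seed: bytes, x: int, depth: int) -> bytes:
    """F(seed, x): walk the GGM tree along the bits of x, MSB first."""
    for i in reversed(range(depth)):
        left, right = prg_double(seed)
        seed = right if (x >> i) & 1 else left
    return seed

leaf = ggm_eval(b"root-seed", 5, 4)  # PRF evaluation at point x = 5 in a depth-4 tree
```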

  30. Punctured Pseudorandom Functions [Boneh-Waters’13, Kiayias-Papadopoulos-Triandopoulos-Zacharias’13, Boyle-Goldwasser-Ivan’13]. An x*-punctured PRF key s* allows evaluating the PRF on all points except x*: give the seeds of all sibling nodes along the path to x*! Key size = λ × tree depth.
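Puncturing a GGM key at x* means handing out the sibling seed at each level of the path to x*; the holder can then evaluate everywhere except x*. A sketch, again with a SHA-256 length-doubling PRG as a stand-in:

```python
import hashlib

def prg_double(seed):
    return (hashlib.sha256(seed + b"0").digest(),
            hashlib.sha256(seed + b"1").digest())

def ggm_eval(seed, x, depth):
    for i in reversed(range(depth)):
        left, right = prg_double(seed)
        seed = right if (x >> i) & 1 else left
    return seed

def puncture(seed, x_star, depth):
    """Punctured key: the sibling seed at each level of the path to x_star."""
    key = []
    for i in reversed(range(depth)):
        left, right = prg_double(seed)
        bit = (x_star >> i) & 1
        key.append((i, 1 - bit, left if bit else right))  # (level, off-path bit, sibling seed)
        seed = right if bit else left
    return key

def punctured_eval(key, x, x_star, depth):
    assert x != x_star, "cannot evaluate at the punctured point"
    for i, bit, sib in key:
        # x branches off the path to x_star at level i: descend from the sibling
        if (x >> i) & 1 == bit and (x >> (i + 1)) == (x_star >> (i + 1)):
            return ggm_eval(sib, x & ((1 << i) - 1), i)
    raise AssertionError("unreachable for x != x_star")

depth, x_star = 4, 5
root = b"root-seed"
key = puncture(root, x_star, depth)
# The punctured key agrees with the full PRF on every point except x_star
for x in range(2 ** depth):
    if x != x_star:
        assert punctured_eval(key, x, x_star, depth) == ggm_eval(root, x, depth)
```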

  31. Primal Construction: All the Pieces. • Small VOLE: c, d on one side; x, cx + d on the other. • Punctured PRF keys S, S* for the multi-point function encoding a′ and the shares of a′x. • Expand: C = M·c + PRF-Eval(S), D = M·d + PRF-Eval(S*), yielding the long VOLE relation Cx + D.

  32. Dual Interpretation of LPN: G·s + e (public G, secret s, sparse noise e) is a random noisy codeword of the code generated by G, and ≈ uniform even given G. The kernel of G is captured by its parity-check matrix H.

  33. Dual Interpretation of LPN: multiplying the random noisy codeword G·s + e by the parity-check H (kernel of G) cancels the codeword G·s, leaving H·e.

  34. Dual Interpretation of LPN: H·e ≈ uniform, where e is sparse noise and H is the parity-check matrix (kernel of G).

  35. Dual Interpretation of LPN: H·e ≈ uniform for sparse noise e. LPN hard for G ⇒ compressing the noise vector by the parity-check H yields pseudorandomness.
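The dual mapping in code: a long t-sparse noise vector e is compressed by a public matrix H (the parity check), and under dual LPN the result H·e looks uniform. Toy parameters, for shape only:

```python
import random

P = 2**31 - 1          # prime modulus (illustrative choice)
N, n, t = 64, 16, 4    # long length, compressed length, noise weight (toy)

H = [[random.randrange(P) for _ in range(N)] for _ in range(n)]

# t-sparse noise vector: succinctly representable by its t (position, value) pairs
e = [0] * N
for i in random.sample(range(N), t):
    e[i] = random.randrange(1, P)

# Compression: u = H e — n field elements derived from a weight-t vector
u = [sum(hij * ej for hij, ej in zip(row, e)) % P for row in H]
```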

  36. Dual Construction. Encode the sparse a′ and x via punctured PRF keys S, S* at the points of a′x. Expand with the public parity-check matrix H: C = H·PRF-Eval(S), D = H·PRF-Eval(S*), yielding the VOLE relation Cx + D.

  37. Recap: PCGs from LPN. “Primal” construction: expand with a public n × k matrix M plus weight-t noise e. “Dual” construction: compress a long sparse noise vector with a parity-check matrix H. • Security: both equivalent to LPN (if H is a parity-check matrix of the code generated by M). • Primal: increasing the output length n forces increasing the secret dimension or the noise weight HW(e); limited to quadratic stretch. • Dual: arbitrary polynomial stretch (increase n, fix HW(e)) ⇒ best attack: exp(HW(e)).

  38. A brief note on generating OT correlations

  39. Oblivious Transfer & OT Correlation. In OT, a receiver with choice bit b ∈ {0,1} obtains y_b from the sender's pair (y₀, y₁). • OT is a complete primitive for general secure computation [Kilian]. Preprocessing the OT correlation: many random tuples (b, y_b) … for the receiver and (y₀, y₁) … for the sender.

  40. Oblivious Transfer Correlation. Prior work: preprocessing the OT correlation required large (linear) communication; “Silent OT” [BCGIKS19] addresses this. • Problem: we need many OTs… and OT is expensive (“public-key”). • OT extension: many OTs from a few base OTs plus symmetric crypto [IKNP03]. • Problem: large communication, O(nλ) for n OTs. • “Silent” OT extension [BCGIKS19]: communication sublinear in n.

  41. “Silent” OT Extension: Securely Generating Seeds. • 2-round secure seed-generation protocol (building from vector OLE). • Hash the vector-OLE outputs to destroy unwanted correlations (similar to [IKNP03]). • Active security: lightweight PPRF consistency checks for a malicious sender; allows selective-failure attacks — the sender can guess 1 bit of the LPN error, so assume the problem is hard with 1-bit leakage; 10–20% overhead on top of semi-honest. • Implementation: the main challenge is fast multiplication by H; with quasi-cyclic H this is polynomial multiplication mod x^N − 1; security based on quasi-cyclic syndrome decoding / ring-LPN.

  42. Runtimes (ms) for n = 10 million random OTs — IKNP OT extension vs silent OT extension (total communication: 160 MB vs 127 kB): • LAN (10 Gbps): 268 vs 2441 (9x) • WAN (100 MBps): 13728 vs 2756 (5x) • WAN (10 MBps): 128854 vs 2726 (47x).
