Computer-Aided Privacy Proofs, César Kunz, joint work with Gilles Barthe, Benjamin Grégoire, and Santiago Zanella-Béguelin (PowerPoint PPT presentation)



SLIDE 1

Computer-Aided Privacy Proofs

César Kunz

Joint work with Gilles Barthe, Benjamin Grégoire, Santiago Zanella-Béguelin

2012.07.10, Provable Privacy Workshop 2012

SLIDE 2

Privacy for Statistical Databases

SLIDE 3

Privacy for Statistical Databases

Maximize Privacy

SLIDE 4

Privacy for Statistical Databases

Maximize Privacy Maximize Utility

Conflicting requirements: sanitizing queries requires striking a good balance.

SLIDE 5

Differential Privacy [Dwork et al. 06]

[Figure: a mechanism K mediating access to the database]

Fix a (symmetric) adjacency relation Φ on databases and a privacy budget ε. A randomized algorithm K : D → R (called a mechanism) is ε-differentially private iff for all D1, D2 such that Φ(D1, D2):

∀S ⊆ R. Pr[K(D1) ∈ S] ≤ exp(ε) × Pr[K(D2) ∈ S]
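As a concrete instance of the definition (an illustration, not taken from the slides), randomized response on a single bit is ε-differentially private when adjacency lets one respondent's bit change; a minimal Python sketch, where the inequality can be checked exactly on the two output probabilities:

```python
import math
import random

def randomized_response(bit, eps):
    """Report the true bit with probability e^eps / (1 + e^eps), else flip it."""
    keep = math.exp(eps) / (1.0 + math.exp(eps))
    return bit if random.random() < keep else 1 - bit

def output_prob(bit, out, eps):
    # Pr[K(bit) = out] for the mechanism above
    keep = math.exp(eps) / (1.0 + math.exp(eps))
    return keep if out == bit else 1.0 - keep

# Adjacent "databases" are the two values of the respondent's bit; the DP
# inequality holds (with equality at the boundary) for every output.
eps = 0.5
for out in (0, 1):
    assert output_prob(0, out, eps) <= math.exp(eps) * output_prob(1, out, eps) + 1e-12
    assert output_prob(1, out, eps) <= math.exp(eps) * output_prob(0, out, eps) + 1e-12
```

The keep-probability e^ε/(1 + e^ε) is exactly the point where the ratio of output probabilities on adjacent inputs equals e^ε.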

SLIDE 6

Differential Privacy [Dwork et al. 06]

[Figure: a mechanism K mediating access to the database]

Fix a (symmetric) adjacency relation Φ on databases and a privacy budget ε. A randomized algorithm K : D → R (called a mechanism) is (ε, δ)-differentially private iff for all D1, D2 such that Φ(D1, D2):

∀S ⊆ R. Pr[K(D1) ∈ S] ≤ exp(ε) × Pr[K(D2) ∈ S] + δ

Still an information-theoretic definition

SLIDE 7

Achieving Differential Privacy

Consider a numerical query f : D → R. Define the sensitivity of f as

∆(f) := max_{D1,D2 : Φ(D1,D2)} |f(D1) − f(D2)|

The mechanism K(D) := f(D) + Lap(∆(f)/ε) is ε-differentially private:

Pr[K(D) = x] ∝ exp(−|f(D) − x| · ε/∆(f))

The Exponential Mechanism generalizes this to arbitrary domains.
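The Laplace mechanism can be sketched in a few lines of Python (a hypothetical counting query is assumed; the standard library has no Laplace sampler, so the variate is built as a difference of two i.i.d. exponentials, which has the same distribution):

```python
import random

def laplace_mechanism(f, db, sensitivity, eps):
    """Release f(db) with additive Laplace(sensitivity/eps) noise (eps-DP)."""
    scale = sensitivity / eps
    # Laplace(0, scale) = scale * (Exp(1) - Exp(1)) for independent exponentials
    noise = scale * (random.expovariate(1.0) - random.expovariate(1.0))
    return f(db) + noise

# Illustrative counting query: adjacent databases differ in a single row,
# so its sensitivity is 1.
db = [0, 1, 1, 0, 1]
count = lambda d: sum(d)
noisy_count = laplace_mechanism(count, db, sensitivity=1.0, eps=0.5)
```

The released value is unbiased: averaged over many runs it concentrates around the true count.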

SLIDE 8

Computer-Aided Crypto Proofs

Game G0: … ← A(…); …    Game G1: …    ⋯    Game Gn: … ← B(…); …

Pr[G0 : E0] ≤ h1(Pr[G1 : E1]) ≤ ⋯ ≤ hn(Pr[Gn : En])

Computer-aided (computational) crypto proofs, a success story

CertiCrypt/EasyCrypt provers

Many examples: Cramer-Shoup, OAEP, FDH, ZK-PoK, Boneh-Franklin IBE, Merkle-Damgård, ZAEP, AKE, ... Best paper at CRYPTO'11

Q: Can we extend these techniques to reason about privacy?
A: Yes, in this talk we will see how

SLIDE 10

Probabilistic While Language

C ::= skip                     nop
    | C; C                     sequence
    | V ← E                    assignment
    | V $← D                   random sampling
    | if E then C else C       conditional
    | while E do C             while loop
    | V ← P(E, …, E)           procedure call

x $← d: sample the value of x according to distribution d

The denotation of a program c is a function from an initial state to a (sub-)distribution over final states: ⟦c⟧ : M → Distr(M). Programs that do not terminate absolutely generate sub-distributions with total probability mass < 1.
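This denotational semantics can be sketched as a toy Python interpreter, assuming an illustrative encoding (not the formal development): states are dicts, sub-distributions are finite maps from frozen states to exact probabilities, and sequencing is the monadic bind.

```python
from fractions import Fraction

def freeze(m):
    # States (dicts) are hashed as sorted tuples of bindings.
    return tuple(sorted(m.items()))

def sample(x, dist):
    """x $<- d, where dist maps values to probabilities."""
    def run(m):
        out = {}
        for v, p in dist.items():
            m2 = dict(m); m2[x] = v
            key = freeze(m2)
            out[key] = out.get(key, Fraction(0)) + p
        return out
    return run

def seq(c1, c2):
    """c1; c2 -- the monadic bind on (sub-)distributions."""
    def run(m):
        out = {}
        for fm1, p1 in c1(m).items():
            for fm2, p2 in c2(dict(fm1)).items():
                out[fm2] = out.get(fm2, Fraction(0)) + p1 * p2
        return out
    return run

# Two fair coin flips: the final distribution is uniform over four states.
coin = {0: Fraction(1, 2), 1: Fraction(1, 2)}
prog = seq(sample("x", coin), sample("y", coin))
d = prog({})
pr_both = sum(p for fm, p in d.items() if dict(fm)["x"] == 1 and dict(fm)["y"] == 1)
```

A non-terminating branch would simply contribute no mass to `d`, giving a sub-distribution with total probability below 1, as described above.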

SLIDE 11

Relational Hoare Logic

Hoare Logic
Judgments: {P} c {Q}
Assertions: P, Q are predicates over program states
Validity: if (c, m) ⇓ m′ and m ⊨ P, then m′ ⊨ Q

Relational Hoare Logic (RHL)
Judgments: ⊢ c1 ∼ c2 : P ⇒ Q
Assertions: P, Q are relations over program states
Validity: if (c1, m1) ⇓ m′1 and (c2, m2) ⇓ m′2, and (m1, m2) ⊨ P, then (m′1, m′2) ⊨ Q

SLIDE 13

Probabilistic Relational Hoare Logic (pRHL)

Judgments: ⊢ c1 ∼ c2 : P ⇒ Q, where P, Q are binary relations over states (as in the deterministic case)

Validity: if (m1, m2) ⊨ P, then (⟦c1⟧ m1, ⟦c2⟧ m2) ⊨ Q♯, where Q♯ is the lifting of Q to a relation over distributions

Inequalities about probabilities can be inferred from valid pRHL judgments: if (m1, m2) ⊨ P, and (m′1, m′2) ⊨ Q implies (m′1 ⊨ A ⇒ m′2 ⊨ B), then

Pr[c1, m1 : A] ≤ Pr[c2, m2 : B]

Other forms of inequalities can be captured using relational logic (e.g. the Fundamental Lemma)
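The lifting Q♯ is witnessed by a coupling: a joint distribution whose marginals are the two output distributions and whose support lies in Q. A toy Python check (the die example and the names are illustrative, not from the slides) makes the inference step concrete:

```python
from fractions import Fraction

# Coupling witnessing Q = {(x, y) : x = y} between two uniform die rolls:
# put all mass on the diagonal.
die = range(1, 7)
coupling = {(v, v): Fraction(1, 6) for v in die}

# Both marginals of the coupling are the uniform distribution on {1..6}.
for v in die:
    assert sum(p for (x, _), p in coupling.items() if x == v) == Fraction(1, 6)
    assert sum(p for (_, y), p in coupling.items() if y == v) == Fraction(1, 6)

# Q together with the implication (x >= 5  =>  y >= 4) yields the bound.
A = lambda x: x >= 5
B = lambda y: y >= 4
assert all(B(y) for (x, y) in coupling if A(x))   # Q implies A => B on the support
pr_A = sum(p for (x, _), p in coupling.items() if A(x))
pr_B = sum(p for (_, y), p in coupling.items() if B(y))
assert pr_A <= pr_B   # Pr[c1 : A] <= Pr[c2 : B]
```

Here Pr[c1 : A] = 1/3 and Pr[c2 : B] = 1/2, so the inferred inequality holds, exactly as the validity condition predicts.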

SLIDE 14

Approximate Probabilistic Relational Hoare Logic

Judgments: ⊢ c1 ∼α,δ c2 : P ⇒ Q

Validity: requires a novel generalization of the lifting of pRHL

What can be inferred from a valid judgment? If (m1, m2) ⊨ P, and (m′1, m′2) ⊨ Q implies (m′1 ⊨ A ⇒ m′2 ⊨ B), then

Pr[c1, m1 : A] ≤ α × Pr[c2, m2 : B] + δ

Exactly what we need to encode DP!
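As a numerical sanity check (not the logic itself), the inequality with α = exp(ε), δ = 0 can be verified in closed form for the Laplace mechanism on a hypothetical pair of adjacent counting-query results (values 3 and 4, sensitivity 1), sweeping over all events of the shape S = (−∞, t]:

```python
import math

def laplace_cdf(x, mu, b):
    # CDF of the Laplace distribution centered at mu with scale b
    if x < mu:
        return 0.5 * math.exp((x - mu) / b)
    return 1.0 - 0.5 * math.exp(-(x - mu) / b)

eps = 0.5
scale = 1.0 / eps            # sensitivity / eps
f1, f2 = 3.0, 4.0            # query results on two adjacent databases

for k in range(-100, 101):
    t = f1 + 0.1 * k
    p1 = laplace_cdf(t, f1, scale)
    p2 = laplace_cdf(t, f2, scale)
    # Pr[c1, m1 : A] <= alpha * Pr[c2, m2 : B] + delta, alpha = e^eps, delta = 0
    assert p1 <= math.exp(eps) * p2 + 1e-12
```

The bound holds because the pointwise ratio of the two Laplace densities never exceeds exp(ε); the small tolerance only absorbs floating-point rounding at the points where the ratio is exactly exp(ε).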

SLIDE 16

Example: Private 2-Party Computation

Two hospitals hold records of some patient's recent blood tests and want to check whether the patient is asking for similar tests.

[Figure: John Doe's record at each hospital as a bit vector over the tests LDL, HDL, HIV, GLU, LEU]

SLIDE 17

Example: Private 2-Party Computation

Two hospitals hold records of some patient's recent blood tests and want to check whether the patient is asking for similar tests.

[Figure: the two bit-vector records a and b, and the similarity score h(a, b) computed between them]

SLIDE 18

Example: Private 2-Party Computation

Using additive homomorphic encryption (e.g. Paillier)

SLIDE 19

Example: Private 2-Party Computation

Using additive homomorphic encryption (e.g. Paillier)

[Protocol step: B sends E(bi), i = 1…n]

SLIDE 20

Example: Private 2-Party Computation

Using additive homomorphic encryption (e.g. Paillier)

[Protocol steps so far: B sends E(bi), i = 1…n; A computes ci ← ai ? E(1 − bi) : E(bi) and sends hA ← Σi ci + noiseA]

SLIDE 21

Example: Private 2-Party Computation

Using additive homomorphic encryption (e.g. Paillier):

B sends E(bi), i = 1…n
A computes ci ← ai ? E(1 − bi) : E(bi), so ci encrypts ai ⊕ bi
A sends hA ← Σi ci + noiseA
B decrypts: h̃A ← D(hA) (= h(a, b) + noiseA)
B sends hB ← h̃A + noiseB
A computes h̃B ← hB − noiseA (= h(a, b) + noiseB)
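The message flow can be simulated in Python. Note that the Paillier scheme is replaced here by an insecure identity stub (`E` and `D` below are placeholders, and the homomorphism is ordinary addition), so this sketch shows only the arithmetic of the blinding, not any security:

```python
import random

def lap(scale):
    # Laplace(0, scale) noise as a difference of two i.i.d. exponentials
    return scale * (random.expovariate(1.0) - random.expovariate(1.0))

# Insecure stand-in for Paillier: "ciphertexts" are the plaintexts.
def E(x): return x
def D(x): return x

def private_hamming(a, b, eps):
    """Blinded Hamming distance between bit vectors a (held by A) and b (held by B)."""
    enc_b = [E(bi) for bi in b]                          # B -> A: E(b_i)
    # A: c_i encrypts a_i XOR b_i (homomorphically E(1 - b_i) when a_i = 1)
    c = [E(1) - ci if ai else ci for ai, ci in zip(a, enc_b)]
    noise_a = lap(1.0 / eps)
    h_a = sum(c) + noise_a                               # A -> B: blinded sum
    h_tilde_a = D(h_a)                                   # B learns h(a,b) + noise_A
    noise_b = lap(1.0 / eps)
    h_b = h_tilde_a + noise_b                            # B -> A: re-blinded
    return h_b - noise_a                                 # A ends with h(a,b) + noise_B

a = [1, 1, 0, 1, 0]
b = [0, 1, 1, 1, 1]
result = private_hamming(a, b, eps=0.5)                  # true distance is 3
```

Each party's final view is the distance masked by the other party's Laplace noise, matching the two noise roles in the theorem below.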

SLIDE 22

Private Hamming Distance Computation

Proofs in the semi-honest (honest-but-curious) model
Some subtleties ignored: truncated noise

Theorem

If the underlying homomorphic scheme is IND-CPA secure, then the protocol is ε-SIM-CDP w.r.t. party A when noiseB = Lap(1/ε).
The protocol is ε-DP w.r.t. party B when noiseA = Lap(1/ε).

SLIDE 23

EasyCrypt: Automated crypto (and privacy) proofs

ProofGeneral frontend (Emacs) / EasyCrypt toplevel (shell), on top of the Why3 software verification platform (Why3 API), with backends:
SMT solvers: Alt-Ergo, CVC3, Z3, Yices
Automated provers: Vampire, E-Prover, SPASS
Interactive provers: Coq

http://easycrypt.gforge.inria.fr