Smooth Sensitivity and Sampling
CompSci 590.03 Instructor: Ashwin Machanavajjhala
Lecture 7 : 590.03 Fall 12
Project Topics: 2-3 minute presentations about each project topic, with 1-2 minutes of questions about each presentation.
[Figure: probability density of the Laplace distribution Lap(λ)]
[Diagram: the researcher sends query q to the database; the true answer q(d) is perturbed with noise η, and q(d) + η is returned.]
[Dwork et al., TCC 2006] Thm: If the sensitivity of the query q is S(q), then releasing q(d) + η with η drawn from Lap(S(q)/ε) guarantees ε-differential privacy.
Sensitivity: the smallest number S(q) such that for any d, d’ differing in one entry, || q(d) – q(d’) || ≤ S(q)
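As a minimal sketch of this mechanism (the counting query and the parameter values below are illustrative, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def laplace_mechanism(true_answer, sensitivity, epsilon, rng=rng):
    """Release q(d) + eta, with eta ~ Lap(S(q)/epsilon) (Dwork et al., TCC 2006)."""
    return true_answer + rng.laplace(loc=0.0, scale=sensitivity / epsilon)

# Example: a counting query has sensitivity 1, since adding or removing
# one record changes the count by at most 1.
noisy_count = laplace_mechanism(true_answer=42, sensitivity=1.0, epsilon=0.5)
```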
– Salary can be anywhere between $200 and $200,000
Sensitivity of qmed = Λ
– d1 = {0, 0, 0, 0, 0, Λ, Λ, Λ, Λ, Λ}; qmed(d1) = 0
– d2 = {0, 0, 0, 0, Λ, Λ, Λ, Λ, Λ, Λ}; qmed(d2) = Λ
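The jump can be checked directly. A short sketch, using the slide's convention that the median of 10 values is the 5th smallest, and $200,000 as a stand-in for Λ:

```python
LAM = 200_000  # upper end of the salary range, standing in for the bound Λ

def qmed(d):
    """Median taken as the 5th smallest of 10 values, per the slides."""
    return sorted(d)[4]  # 0-indexed: element x5

d1 = [0] * 5 + [LAM] * 5
d2 = [0] * 4 + [LAM] * 6   # differs from d1 in one entry
print(qmed(d1), qmed(d2))  # the answer jumps from 0 all the way to LAM
```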
[Figure: example graph with edge weights = Λ; cost of the MST = 3Λ]
following function is minimized.
x4 ≤ qmed(d’) ≤ x6, so the sensitivity of qmed at d = max(x5 – x4, x6 – x5) << Λ
[Nissim et al., STOC 2007] Local sensitivity: the smallest number LSq(d) s.t. for any d’ differing in one entry from d, || q(d) – q(d’) || ≤ LSq(d).
Global sensitivity: S(q) = maxd LSq(d).
Can we add noise proportional to local sensitivity?
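For the median of 10 values (taken as the 5th smallest, per the slides), the local sensitivity has the closed form max(x5 – x4, x6 – x5): changing one entry can move the median at most down to x4 or up to x6. A sketch, with illustrative data:

```python
def local_sensitivity_median(d):
    """LS of the median at d, for 10 values with median x5 (slide convention):
    changing one entry moves the median at most to x4 or to x6."""
    x = sorted(d)
    return max(x[4] - x[3], x[5] - x[4])  # x5 - x4 and x6 - x5, 0-indexed

# Evenly spaced data: LS is small.
print(local_sensitivity_median([10, 20, 30, 40, 50, 60, 70, 80, 90, 100]))  # 10
# The earlier d1 (with 1000 standing in for the bound): LS is as big as Λ.
print(local_sensitivity_median([0] * 5 + [1000] * 5))                       # 1000
```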
qmed(d1) = 0 and LSqmed(d1) = 0 => noise sampled from Lap(0/ε), so the released answer is always 0.
qmed(d2) = 0 and LSqmed(d2) = Λ => noise sampled from Lap(Λ/ε).
Pr[answer > 0 | d1] = 0 while Pr[answer > 0 | d2] > 0, which implies an adversary can distinguish the neighboring databases d1 and d2.
LSqmed(d1) = 0 and LSqmed(d2) = Λ imply S(LSq(.)) ≥ Λ: the local sensitivity LSqmed(d) itself has very high sensitivity. Adding noise proportional to local sensitivity therefore does not guarantee differential privacy.
[Figure: local sensitivity varies across databases D1 … D6; global sensitivity is a constant upper bound; smooth sensitivity is a smooth upper envelope of the local sensitivity]
[Nissim et al., STOC 2007] S(.) is a β-smooth upper bound on the local sensitivity if:
– For all d, Sq(d) ≥ LSq(d)
– For all d, d’ differing in one entry, Sq(d) ≤ exp(β) Sq(d’)
Smooth sensitivity: S*q(d) = maxd’ ( LSq(d’) exp(-mβ) ), where d and d’ differ in m entries.
S*qmed(d) = maxk ( exp(-kβ) ∙ max5-k ≤ med ≤ 5+k max(xmed+1 – xmed, xmed – xmed-1) )
For instance, Λ = 1000, β = 2. S*qmed(d) = max ( max0≤k≤4(exp(-β∙k) ∙ 1), max5≤k≤10 (exp(-β∙k) ∙ Λ) ) = 1
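The formula can be evaluated directly. A sketch; the padding x0 = 0 and xn+1 = Λ for the boundary gaps, and the example database with unit gaps, are assumptions of this illustration:

```python
import math

def smooth_sensitivity_median(d, beta, lam):
    """Slide's formula: S*(d) = max_k exp(-beta*k) * max over positions
    med in [5-k, 5+k] of max(x_{med+1} - x_{med}, x_{med} - x_{med-1}).
    Values are assumed to lie in [0, lam]; padding x_0 = 0 and
    x_{n+1} = lam for the boundary gaps is an assumption made here."""
    n = len(d)
    x = [0.0] + sorted(d) + [float(lam)]   # 1-indexed: x[1..n], padded
    best = 0.0
    for k in range(n + 1):
        lo, hi = max(1, 5 - k), min(n, 5 + k)
        gap = max(max(x[m + 1] - x[m], x[m] - x[m - 1]) for m in range(lo, hi + 1))
        best = max(best, math.exp(-beta * k) * gap)
    return best

# Unit gaps near the median, range bound Λ = 1000, β = 2:
print(smooth_sensitivity_median(list(range(1, 11)), beta=2, lam=1000))  # 1.0
```

This matches the slide's worked example in spirit: the exp(-βk) discount makes the large boundary gap (reachable only at k ≥ 5) contribute less than the unit gaps at the median, so S* = 1 even though the range is Λ = 1000.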
Theorem: P[f(D) ∈ O] ≤ eε P[f(D’) ∈ O] + δ for all D, D’ that differ in one entry, and for all sets of outputs O (i.e., f satisfies (ε, δ)-differential privacy).
A(d) = q(d) + Z ∙ (S*q(d) / α), where Z is sampled from an admissible noise distribution.
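A generic sketch of this mechanism. The choice Z ~ Lap(1), and how α and β must be set as functions of ε and δ, come from the paper's calibration lemmas; the parameters below are placeholders, not the paper's exact constants:

```python
import numpy as np

rng = np.random.default_rng(1)

def smooth_noise_mechanism(q_of_d, smooth_sens, alpha, rng=rng):
    """A(d) = q(d) + Z * S*_q(d) / alpha, with Z drawn from an admissible
    noise distribution (Laplace is used here as one admissible choice;
    alpha must be set from epsilon and delta per Nissim et al.)."""
    z = rng.laplace(0.0, 1.0)
    return q_of_d + z * smooth_sens / alpha
```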
Local sensitivity:
– Is itself very sensitive.
– Adding noise proportional to it causes privacy breaches.
Smooth sensitivity:
– Not sensitive (changes by at most a factor of exp(β) between neighboring databases).
– Can be much smaller than global sensitivity.
sensitivity of the function.
[Diagram: sample and aggregate: take samples without replacement from the original data, apply the original function to each sample, and combine the results with a new aggregation function]
(assumed to be drawn i.i.d. from some distribution)
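The pipeline above can be sketched as follows, using disjoint blocks and a noisy average as the aggregation. The block count, the clipping range [lo, hi] on the per-block answers, and the use of the median as the original function are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_and_aggregate(data, f, num_blocks, epsilon, lo, hi, rng=rng):
    """Split the data into disjoint random blocks, apply f to each block,
    then release a noisy average of the clipped block answers. Changing one
    record affects only one block, hence one of the num_blocks answers, so
    the average has sensitivity (hi - lo) / num_blocks; Laplace noise
    calibrated to that gives epsilon-DP."""
    data = rng.permutation(data)
    blocks = np.array_split(data, num_blocks)
    answers = np.clip([f(b) for b in blocks], lo, hi)
    sensitivity = (hi - lo) / num_blocks
    return float(np.mean(answers) + rng.laplace(0.0, sensitivity / epsilon))

est = sample_and_aggregate(rng.normal(50, 5, size=10_000), np.median,
                           num_blocks=100, epsilon=1.0, lo=0, hi=100)
```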
Solution:
Solution:
Utility Theorem:
Solution:
Privacy: the average is a deterministic aggregation, so by itself it does not guarantee differential privacy. (Add noise calibrated to the sensitivity of the average.)
– Round up the αk smallest values to zαk.
– Round down the αk largest values to z(1-α)k.
– Compute the mean of the new set of values.
– Sensitivity = |a-b|/k, so Laplace noise of scale |a-b|/(kε) is added.
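A sketch of this recipe; the clipping bounds a = zαk and b = z(1-α)k and the noise scale |a-b|/(kε) from the slide are used as given:

```python
import numpy as np

rng = np.random.default_rng(3)

def winsorized_mean_dp(values, alpha, epsilon, rng=rng):
    """Round the alpha*k smallest values up to z_{alpha k}, the alpha*k
    largest down to z_{(1-alpha)k}, average, and add Laplace noise with
    the slide's scale |a - b| / (k * epsilon)."""
    z = np.sort(np.asarray(values, dtype=float))
    k = len(z)
    cut = int(alpha * k)
    a, b = z[cut], z[k - cut - 1]      # z_{alpha k} and z_{(1-alpha) k}
    clipped = np.clip(z, a, b)         # rounding up/down is just clipping
    return float(clipped.mean() + rng.laplace(0.0, abs(a - b) / (k * epsilon)))
```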
Sample and aggregate is useful when computing sensitivity is hard.
C. Dwork, F. McSherry, K. Nissim, A. Smith, “Calibrating noise to sensitivity in private data analysis”, TCC 2006
K. Nissim, S. Raskhodnikova, A. Smith, “Smooth sensitivity and sampling in private data analysis”, STOC 2007
2011