Quantum to Classical Randomness Extractors
Mario Berta, Omar Fawzi, Stephanie Wehner
- Full version preprint available at arXiv:1111.2026v3
08/23/2012 - CRYPTO, University of California, Santa Barbara
Classical to Classical Extractors
Given an (unknown) weak source of classical randomness, how do we convert it into uniformly random bits?

Source: N = N_1, N_2, ..., N_q. Ex: Pr[N_1 = 0] = 1/2 + δ_1, Pr[N_2 = 0] = 1/2 + δ_2, ...

Only a minimal guarantee about the randomness of the source, namely high min-entropy:

H_min(N)_P = −log max_n P_N(n) = −log p_guess(N)_P .

Apply a function: f(N = N_1, ..., N_q) = M. Ex: for Pr[N_i = 0] = 2/3, Pr[N_i = 1] = 1/3, the parity M = f(N_1 N_2 N_3) = N_1 + N_2 + N_3 mod 2 still gives Pr[M = 0] ≈ 0.52.

Indeed, it is not possible to obtain uniform randomness with any deterministic function; instead, invest a small amount of perfect randomness as a seed D and output M = f_D(N).

Lost randomness? Strong extractors: (M, D) are jointly uniform.

Applications in information theory, cryptography and computational complexity theory [1,2].

[1] Nisan and Zuckerman, JCSS 52:43, 1996
[2] Vadhan, http://people.seas.harvard.edu/~salil/pseudorandomness/
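The parity example can be checked directly; a short Python calculation over the slide's i.i.d. biased source:

```python
from itertools import product
from math import log2

# Biased source from the slide: Pr[N_i = 0] = 2/3, Pr[N_i = 1] = 1/3, i.i.d.
p = {0: 2 / 3, 1: 1 / 3}

# Deterministic extractor attempt: M = N1 + N2 + N3 mod 2 (parity).
pr_M0 = sum(p[a] * p[b] * p[c]
            for a, b, c in product((0, 1), repeat=3)
            if (a + b + c) % 2 == 0)
print(round(pr_M0, 4))  # 0.5185, the slide's Pr[M = 0] ≈ 0.52: still biased

# Min-entropy of the 3-bit source: H_min(N) = -log max_n P_N(n).
p_max = max(p[a] * p[b] * p[c] for a, b, c in product((0, 1), repeat=3))
print(round(-log2(p_max), 3))  # 1.755 bits out of 3
```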
Deal with prior knowledge: trivial for classical side information [3], but in general problematic for quantum side information [4]! The source is described by a classical-quantum (cq) state:

ρ_NE = Σ_n p_n |n⟩⟨n|_N ⊗ ρ_E^n .

[3] König and Terhal, IEEE TIT 54:749, 2008
[4] Gavinsky et al., STOC, 2007
The extracted output M, together with the seed D, must be close to uniform and decoupled from E:

‖ρ_MED − (id_M / M) ⊗ ρ_ED‖_1 ≤ ε , where ‖X‖_1 = tr[√(X†X)] .
Guarantee about the conditional min-entropy of the source:

H_min(N|E)_ρ = −log p_guess(N|E)_ρ .
Ex: two-universal hashing / privacy amplification [5]. For all cq-states ρ_NE with H_min(N|E)_ρ ≥ k, we have ‖ρ_MED − (id_M / M) ⊗ ρ_ED‖_1 ≤ ε for output size M = 2^k · ε^2. This is a strong (k, ε)-extractor (against quantum side information) with seed size D = O(N).

[5] Renner, PhD Thesis, ETHZ, 2005
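To make the hashing step concrete, here is a sketch of one standard two-universal family, random Toeplitz matrices over GF(2); the slide does not commit to a specific family, so the construction and all names here are illustrative:

```python
import secrets

def toeplitz_hash(seed_bits, input_bits, m):
    """Hash n input bits down to m bits with a random m x n Toeplitz
    matrix over GF(2), specified by n + m - 1 seed bits."""
    n = len(input_bits)
    assert len(seed_bits) == n + m - 1
    out = []
    for i in range(m):
        bit = 0
        for j in range(n):
            # Toeplitz structure: entry (i, j) depends only on i - j.
            bit ^= seed_bits[i - j + n - 1] & input_bits[j]
        out.append(bit)
    return out

n, m = 16, 4
seed = [secrets.randbits(1) for _ in range(n + m - 1)]
x = [secrets.randbits(1) for _ in range(n)]
y = [secrets.randbits(1) for _ in range(n)]

# For a fixed seed the hash is GF(2)-linear in the input:
xy = [a ^ b for a, b in zip(x, y)]
hx, hy, hxy = (toeplitz_hash(seed, v, m) for v in (x, y, xy))
assert hxy == [a ^ b for a, b in zip(hx, hy)]
```

Privacy amplification then keeps m ≈ k − 2 log(1/ε) almost-uniform bits, which matches the slide's M = 2^k · ε^2.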
Quantum to Classical Extractors

Motivation: How do we get weak randomness in the first place? How much randomness can be gained from a quantum source?
Idea: same setup as in the classical case (no control over the source), but now N is quantum and gets measured. The only guarantee is again the conditional min-entropy [6]:

H_min(N|E)_ρ = −log [ N · max_{Λ_{E→N′}} F(Φ_{NN′}, (id_N ⊗ Λ_{E→N′})(ρ_NE)) ] ,

with |Φ⟩_NN′ = (1/√N) Σ_{n=1}^N |n⟩_N ⊗ |n⟩_N′ and F(ρ, σ) = ‖√ρ √σ‖_1^2 .

[6] König et al., IEEE TIT 55:4674, 2009
This quantity can get negative for entangled input states; in fact, for the maximally entangled state (MES): H_min(N|E)_Φ = −log N .
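The fidelity F(ρ, σ) = ‖√ρ √σ‖_1^2 in this formula is straightforward to evaluate numerically, and the maximally entangled state makes the negativity explicit: the identity channel on N′ already achieves fidelity 1. A minimal numpy sketch (the function names are ours):

```python
import numpy as np

def psd_sqrt(rho):
    """Square root of a Hermitian PSD matrix via eigendecomposition."""
    w, v = np.linalg.eigh(rho)
    return (v * np.sqrt(np.clip(w, 0, None))) @ v.conj().T

def trace_norm(x):
    # ||X||_1 = tr sqrt(X^dag X) = sum of singular values
    return np.linalg.svd(x, compute_uv=False).sum()

def fidelity(rho, sigma):
    # F(rho, sigma) = ||sqrt(rho) sqrt(sigma)||_1^2
    return trace_norm(psd_sqrt(rho) @ psd_sqrt(sigma)) ** 2

# Maximally entangled state |Phi> = (1/sqrt(N)) sum_n |n>|n> with N = 4.
N = 4
phi = np.eye(N).reshape(-1) / np.sqrt(N)
Phi = np.outer(phi, phi.conj())

# Taking Lambda = id already gives fidelity 1, so the max is 1 and
# H_min(N|E)_Phi = -log(N * 1) = -log N.
print(round(fidelity(Phi, Phi), 6))                       # 1.0
print(round(float(-np.log2(N * fidelity(Phi, Phi))), 6))  # -2.0 = -log2(4)
```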
Definition: A set of unitaries {U_1, ..., U_D} defines a (k, ε)-strong qc-extractor (against quantum side information) if for any state ρ_NE with H_min(N|E)_ρ ≥ k,

‖ (1/D) Σ_{i=1}^D τ_{N→M}(U_i ρ_NE U_i†) ⊗ |i⟩⟨i|_D − (id_M / M) ⊗ ρ_ED ‖_1 ≤ ε ,

where τ_{N→M}(·) = Σ_{m∈M, j∈N\M} ⟨mj|(·)|mj⟩ |m⟩⟨m|_M measures in the computational basis and discards the register N\M.

Without side information, this corresponds to ε-metric uncertainty relations [7]. Fully quantum versions of this exist: decoupling theorems (quantum coding theory) [8], quantum state randomization [9], quantum extractors [10]: quantum to quantum (qq)-randomness extractors!

[7] Fawzi et al., STOC, 2011
[8] Dupuis, PhD Thesis, McGill, 2009
[9] Hayden et al., CMP 250:371, 2004
[10] Ben-Aroya et al., TOC 6:47, 2010
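The measurement map τ_{N→M} is simply "measure everything in the computational basis, keep the M-register, discard the rest". A numpy sketch (the function name and test states are ours):

```python
import numpy as np

def tau(rho, M):
    """Measure rho (dimension N = M * J) in the computational basis,
    keep the first register of dimension M, discard the second.
    Returns the resulting diagonal (classical) state on M."""
    N = rho.shape[0]
    assert N % M == 0
    diag = np.real(np.diag(rho)).reshape(M, N // M)
    return np.diag(diag.sum(axis=1))

N, M = 8, 2
# The maximally mixed input yields the uniform output id_M / M:
out = tau(np.eye(N) / N, M)
print(np.diag(out))  # [0.5 0.5]

# A computational-basis state gives a deterministic outcome:
e0 = np.zeros((N, N))
e0[0, 0] = 1.0
print(np.diag(tau(e0, M)))  # [1. 0.]
```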
Probabilistic construction: output size M = min{N, N · 2^k · ε^4}, seed size D = M · log N · ε^{−4}.
Converse bounds: output size M ≤ N · 2^{k_ε}, where k_ε = H^ε_min(N|E)_ρ = max_{ρ̄ ∈ B^ε(ρ)} H_min(N|E)_ρ̄ (smooth entropies [5,11]); seed size D ≥ ε^{−1}.

[5] Renner, PhD Thesis, ETHZ, 2005
[11] Tomamichel, PhD Thesis, ETHZ, 2012
Huge gap! We know that our proof technique can only yield D ≥ ε^{−2} · min{N · 2^{−k−1}, M/4} [12].

[12] Fawzi, PhD Thesis, McGill, 2012
Explicit construction: (almost) unitary two-designs reproduce the second moment of random unitaries [8,13]: D = O(N^4), M = min{N, N · 2^k · ε^2}.

[8] Dupuis, PhD Thesis, McGill, 2009
[13] Szehr et al., arXiv:1109.4348v1
Explicit construction: a full set of mutually unbiased bases together with two-wise independent permutations: D = N · (N + 1)^2, M = min{N, N · 2^k · ε^2}.
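The two-wise independent permutations can be instantiated as affine maps x ↦ a·x ⊕ b over a finite field with a ≠ 0 (a standard choice; the slide does not fix one). A check over GF(8) that, for any fixed x ≠ y, the image pair is uniform over distinct pairs:

```python
from collections import Counter

def gf8_mul(a, b):
    """Multiplication in GF(8) = GF(2)[x] / (x^3 + x + 1)."""
    r = 0
    for i in range(3):          # carry-less polynomial product
        if (b >> i) & 1:
            r ^= a << i
    for i in (4, 3):            # reduce degrees 4, 3 mod x^3 + x + 1
        if (r >> i) & 1:
            r ^= 0b1011 << (i - 3)
    return r

# Family {x -> a*x + b : a != 0, b arbitrary} of permutations of GF(8).
x, y = 0b001, 0b010
counts = Counter(
    (gf8_mul(a, x) ^ b, gf8_mul(a, y) ^ b)
    for a in range(1, 8) for b in range(8)
)
# Two-wise independence: each of the 8*7 = 56 distinct ordered pairs
# occurs for exactly one of the 7*8 = 56 family members.
print(len(counts), set(counts.values()))  # 56 {1}
```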
Bitwise qc-extractors! Let N = 2^n, M = 2^m. The set of unitaries defined by a full set of mutually unbiased bases for each qubit, {σ_X, σ_Y, σ_Z}^⊗n, together with two-wise independent permutations gives M = O(N^{log 3 − 1} · ε^4) · min{1, 2^k}, D = N · (N − 1) · 3^{log N}.
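A toy simulation of the bitwise measurement step, measuring each qubit in a basis drawn uniformly from {σ_X, σ_Y, σ_Z}; the two-wise independent permutation part of the construction is omitted, and all names are ours:

```python
import numpy as np

rng = np.random.default_rng(0)

# Rows are the bras of the eigenbases of the single-qubit Paulis.
BASES = {
    "Z": np.array([[1, 0], [0, 1]], dtype=complex),
    "X": np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2),
    "Y": np.array([[1, -1j], [1, 1j]], dtype=complex) / np.sqrt(2),
}

def measure_qubit(psi, basis):
    """Projectively measure a single-qubit state in the given Pauli basis."""
    probs = np.abs(BASES[basis] @ psi) ** 2
    return rng.choice(2, p=probs / probs.sum())

# Measure |0>^n bitwise in random bases.
n = 6
psi0 = np.array([1, 0], dtype=complex)
bases = rng.choice(list("XYZ"), size=n)
bits = [measure_qubit(psi0, b) for b in bases]

# Z-basis qubits give a deterministic 0; X/Y qubits give a uniform bit.
assert all(bits[i] == 0 for i in range(n) if bases[i] == "Z")
print(list(bases), bits)
```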
Application: Two-Party Cryptography

Secure function evaluation: two parties hold inputs x and y and want to compute f(x, y), such that the first party does not learn y and the second does not learn x. This is not possible without assumptions [17]. Classical assumptions are typically computational (e.g. factoring is hard). Physical assumption: bounded quantum storage [18]; secure function evaluation becomes possible [19].

[17] Lo, PRA 56:1154, 1997
[18] Damgård et al., CRYPTO, 2007
[19] König et al., IEEE TIT 58:1962, 2012
Noisy-Storage Model

What the adversary can do: computationally all-powerful, unlimited classical storage, instantaneous actions, BUT only noisy (bounded) quantum storage [20]. Basic idea: the protocol has waiting times during which the noisy storage must be used! Implement the task 'weak string erasure' (sufficient [21]). Using bitwise qc-randomness extractors, we can link security to the entanglement fidelity (quantum capacity) of the noisy quantum storage (improves [19,22])!

[19] König et al., IEEE TIT 58:1962, 2012
[20] Wehner et al., PRL 100:220502, 2008
[21] Kilian, STOC, 1998
[22] B. et al., IEEE ISIT, 2012
Entropic Uncertainty Relations

Review article [14]. Given a quantum state ρ and a set of measurements {K_1, ..., K_D}, these relations usually take the form (where H(·) denotes e.g. the Shannon entropy):

H(K|D) = (1/D) Σ_{i=1}^D H(K_i | D = i) ≥ const(K) .

Idea of [15]: add quantum side information! Start with a bipartite quantum state ρ_AE and a set of measurements {K_1, ..., K_D} on A:

H(K|ED) = (1/D) Σ_{i=1}^D H(K_i | E, D = i) ≥ const(K) + H(A|E) ,

where H(A)_ρ = −tr[ρ_A log ρ_A] is the von Neumann entropy and H(A|B)_ρ = H(AB)_ρ − H(B)_ρ its conditional version (which can get negative for entangled input states!).

QC-extractors (against quantum side information) give entropic uncertainty relations with quantum side information! Conversely, entropic uncertainty relations with quantum side information together with cc-extractors give qc-extractors (against quantum side information) [16]!

[14] Wehner and Winter, NJP 12:025009, 2010
[15] B. et al., NP 6:659, 2010
[16] B./Wehner/Coles, unpublished
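For the pair {σ_X, σ_Z} on a single qubit, the constant is the Maassen-Uffink bound: H(K_X) + H(K_Z) ≥ −2 log c with overlap c = 1/√2, i.e. const(K) = 1/2 in the averaged form above. A numerical spot check (all names are ours):

```python
import numpy as np

rng = np.random.default_rng(1)

def shannon(p):
    p = p[p > 1e-12]
    return float(-(p * np.log2(p)).sum())

# Bras of the measurement bases for K_1 = sigma_Z and K_2 = sigma_X.
Z = np.eye(2, dtype=complex)
X = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Averaged Maassen-Uffink form: H(K|D) = (H(K_Z) + H(K_X)) / 2 >= 1/2.
for _ in range(100):
    psi = rng.normal(size=2) + 1j * rng.normal(size=2)
    psi /= np.linalg.norm(psi)
    h_avg = (shannon(np.abs(Z @ psi) ** 2) + shannon(np.abs(X @ psi) ** 2)) / 2
    assert h_avg >= 0.5 - 1e-9
print("H(K|D) >= 1/2 held for 100 random pure qubit states")
```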
Conclusions

- Definition of quantum to classical (qc)-randomness extractors.
- Probabilistic and explicit constructions as well as converse bounds.
- Close relation to entropic uncertainty relations with quantum side information.
- Security in the noisy-storage model linked to the quantum capacity.

Open questions

- Seed length: ε^{−1} ≤ D ≤ M · log N · ε^{−4}. We believe that at least D = polylog(N) might be possible (cf. cc-extractors against quantum side information [23]). However, our proof technique can only yield D ≥ ε^{−2} · min{N · 2^{−k−1}, M/4} [12].
- Relation between qq-, qc-, and cc-extractors?
- Bitwise qc-randomness extractor for the {σ_X, σ_Z}^⊗n (BB84) encoding?
- Improve the bound for the {σ_X, σ_Y, σ_Z}^⊗n (six-state) encoding for large n?

[12] Fawzi, PhD Thesis, McGill, 2012
[23] De et al., arXiv:0912.5514v3