Time space tradeoffs for attacks against one-way functions and PRGs


SLIDE 1

Time space tradeoffs for attacks against one-way functions and PRGs

Anindya De, University of California, Berkeley

Joint work with Luca Trevisan (UC Berkeley and Stanford University) and Madhur Tulsiani (Princeton University)

0 / 26


SLIDE 6

What is this talk about?

  • Can “brute-force” attacks on cryptographic primitives be improved upon?
  • Recover a key of length k in time less than 2^k.
  • In time t, recover the key with probability better than t/2^k.
  • Brute force is optimal when restricted to uniform algorithms.
  • Are better (non-uniform) attacks possible against:
    • one-way functions?
    • pseudo-random generators?

1 / 26
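The brute-force baseline above is easy to make concrete. A minimal Python sketch (the function f here is a hypothetical toy permutation, not a cryptographic primitive, and all names are our own):

```python
import random

def brute_force_invert(f, y, n, t):
    """Guess t uniformly random n-bit keys; each guess hits a fixed key
    with probability 1/2^n, so t guesses succeed with probability
    roughly t / 2^n (for t much smaller than 2^n)."""
    for _ in range(t):
        x = random.randrange(2 ** n)
        if f(x) == y:
            return x
    return None

# Toy instance (hypothetical): an invertible affine map on 10-bit strings.
n = 10
f = lambda x: (x * 7 + 3) % (2 ** n)   # gcd(7, 2^10) = 1, so f is a permutation

y = f(123)
# Exhaustive search always succeeds after at most 2^n evaluations of f.
x = next(x for x in range(2 ** n) if f(x) == y)
assert x == 123
```

The talk's question is whether pre-computation can beat this t/2^n success curve.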


SLIDE 9

Definitions of primitives

  • N = 2^n, [N] ≅ {0, 1}^n.
  • One-way function: f : [N] → [N] is (t, ε)-one-way if for every algorithm A of complexity ≤ t,
    Pr_{x∼{0,1}^n} [ A^f(f(x)) = x′ such that f(x′) = f(x) ] ≤ ε
  • PRG: G : [N] → [2N] is a (t, ε)-secure PRG if for every algorithm A of complexity ≤ t,
    | Pr_{x∼[N]} [A^G(G(x)) = 1] − Pr_{y∼[2N]} [A^G(y) = 1] | ≤ ε

2 / 26


SLIDE 12

Measure of Complexity

  • complexity = time alone does not work: A may compute f⁻¹ in O(log N) time by storing all inverses.
  • complexity = pre-computed advice + running time.
  • Can be implemented on a RAM machine with time and space t.
  • Similar to circuit complexity.

3 / 26


SLIDE 19

Upper bounds

  Source          Primitive                          Complexity
  [Hellman 80]    Permutation f                      Õ(√N)
  [Hellman 80]    Random function f (heuristic)      Õ(N^{2/3})
  [Fiat-Naor 99]  Any f, all inputs                  Õ(N^{3/4})
  [DTT 10]        Any f, ε-fraction of inputs        Õ(√(εN)) for ε ≤ N^{−1/3}; Õ(ε^{5/4}N^{3/4}) for ε ≥ N^{−1/3}
  [ACR 97]        PRG G(x) := (f(x), P(x))           Õ(ε²N)
  [DTT 10]        Any PRG                            Õ(ε²N)

All of the above results are actually stated as time-space tradeoffs; the complexity is optimized when T = S.

4 / 26


SLIDE 23

Lower bounds

Better stated in terms of a tradeoff between T and S.

  Source                                   Primitive                            Tradeoff
  [Yao 90] [Gennaro-Trevisan 00] [Wee 05]  Permutation f, ε-fraction of inputs  T · S = Ω̃(εN) for T = O(√(εN))
  [DTT 10]                                 Permutation f, ε-fraction of inputs  T · S = Ω̃(εN) for any T
  [DTT 10]                                 PRG G := (f(x), P(x))                T · S = Ω(ε²N)

5 / 26


SLIDE 30

Hellman’s approach for permutations

[Figure: a cycle x → f(x) → f(f(x)) → f(f(f(x))) → … → x]

In small cycles of size less than √N, compute f(x), f(f(x)), . . . At some point, you hit x; f⁻¹(x) is the penultimate point in the sequence. The time complexity of this computation is Õ(√N).

6 / 26


SLIDE 32

What happens to large cycles?

[Figure: a large cycle with marked points a, b, x, c, d, spaced √N apart]

In large cycles, store back-links at a distance of √N. For example, store (a, b), (b, c), (c, d) and (d, a) in a data structure.

7 / 26


SLIDE 34

What happens to large cycles?

Compute f(x), f(f(x)), . . . till you hit a point in the data structure, say a. When you hit a, use the back-link to go back to b.

8 / 26


SLIDE 36

What happens to large cycles?

Now, compute f(b), f(f(b)), . . . until you hit x. The penultimate point in the sequence is f⁻¹(x).

9 / 26


SLIDE 38

What happens to large cycles?

Note that all the cycles can be covered by O(√N) back-links (each back-link covering a distance of √N). Also, the total time complexity is Õ(√N), since you hit a back-link within that many steps.

10 / 26


SLIDE 40

Time and space complexity for inverting permutations

  • Total time T = Õ(√N) and space S = Õ(√N).
  • Can be used to invert an ε fraction of the elements in time T = Õ(√(εN)) and space S = Õ(√(εN)).
  • In fact, we can achieve any time (T) space (S) tradeoff such that T · S = εN.

11 / 26
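The whole cycle-walking scheme fits in a few lines. A toy Python sketch of the back-link construction (illustrative only; the permutation is randomly generated and all names are our own, not the paper's):

```python
import math
import random

def build_backlinks(perm, N):
    """Preprocessing: walk every cycle of the permutation; in cycles longer
    than L ~ sqrt(N), store a back-link every L steps, mapping a stored
    point to the point exactly L steps behind it on its cycle.
    Total links stored: O(N / L) = O(sqrt(N))."""
    L = math.isqrt(N) + 1
    seen = [False] * N
    links = {}
    for s in range(N):
        if seen[s]:
            continue
        cycle = [s]
        seen[s] = True
        x = perm[s]
        while x != s:
            seen[x] = True
            cycle.append(x)
            x = perm[x]
        if len(cycle) > L:
            for i in range(0, len(cycle), L):
                links[cycle[i]] = cycle[i - L]   # point L steps behind
    return links, L

def invert(perm, y, links):
    """Online phase: recover perm^{-1}(y) with O(sqrt(N)) applications of perm."""
    if y in links:                      # y itself is a stored point
        prev, cur = links[y], perm[links[y]]
    else:
        prev, cur = y, perm[y]
        while cur != y and cur not in links:
            prev, cur = cur, perm[cur]
        if cur == y:                    # short cycle: walked all the way around
            return prev
        # long cycle: jump to the point L steps behind the stored point we hit
        prev, cur = links[cur], perm[links[cur]]
    while cur != y:                     # walk forward again until we reach y
        prev, cur = cur, perm[cur]
    return prev

# Sanity check on a random permutation of [N].
N = 1000
rng = random.Random(1)
perm = list(range(N))
rng.shuffle(perm)
links, L = build_backlinks(perm, N)
assert all(perm[invert(perm, y, links)] == y for y in range(N))
```

Both phases make at most about 2√N calls to `perm`, and the table holds O(√N) links, matching the T = S = Õ(√N) point of the tradeoff.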


SLIDE 43

Abstracting the approach for permutations

  • Cover the graph (x → f(x)) of f by m disjoint paths of length ℓ.
  • Gives an algorithm with T = Õ(ℓ) and S = Õ(m) (one back-link per path).
  • Problem: m may have to be very large, because the graph (x → f(x)) may not have many long and disjoint paths.

12 / 26


SLIDE 46

Approach for random functions [Hellman, Fiat-Naor]

  • Collision probability: λ = Pr_{x,x′∼[N]} [f(x) = f(x′)].
  • If h is a (known) permutation, then inverting h ∘ f suffices. If h is random and f has low collision probability, then h ∘ f has many long paths which are pairwise disjoint.

[Figure: the functional graphs of f and of h ∘ f on the same domain; the graph of h ∘ f decomposes into long disjoint paths]

13 / 26


SLIDE 51

Inverting random functions (λ ≈ 1/N)

  • For independent random permutations h₁, . . . , hᵣ, let gᵢ = hᵢ ∘ f.
  • For each gᵢ, can find m disjoint paths of length ℓ as long as m · ℓ² · λ ≪ 1 (each gᵢ inverts m · ℓ elements).
  • If the gᵢ's behave independently, the elements inverted by each of them are independent. Overall, O(m · ℓ · r) elements are inverted.
  • Choose m, ℓ, r = Õ(N^{1/3}). Then T = O(ℓ · r) = Õ(N^{2/3}) and S = O(m · r) = Õ(N^{2/3}).
  • Problems: Computing h₁, . . . , hᵣ is hard. The heuristic works only for random f.

14 / 26
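The chain construction above can be sketched in Python. This is a toy stand-in, not the paper's construction: each hᵢ is a simple additive shift rather than a truly random permutation, f is a random function, and the parameters m = ℓ = r ≈ N^{1/3} are illustrative:

```python
import random

def build_tables(f, N, m, ell, r, rng):
    """For g_i = h_i o f with h_i(y) = (y + c_i) mod N (toy stand-in for a
    random permutation), store m chains of length ell per table, keeping
    only (endpoint -> startpoint) pairs.  Space: O(m * r) pairs."""
    shifts = [rng.randrange(N) for _ in range(r)]
    tables = []
    for c in shifts:
        table = {}
        for _ in range(m):
            x0 = rng.randrange(N)
            x = x0
            for _ in range(ell):
                x = (f[x] + c) % N      # one step of g_i
            table[x] = x0               # endpoint -> startpoint
        tables.append(table)
    return tables, shifts

def invert(f, y, N, tables, shifts, ell):
    """Look for x with f[x] = y.  If some chain passes through such an x,
    iterating g_i from h_i(y) reaches that chain's stored endpoint; we then
    replay the chain from its start.  Coverage is heuristic, so this may
    return None, but any answer returned is a genuine preimage."""
    for table, c in zip(tables, shifts):
        z = (y + c) % N                 # equals g_i(x) for the x we want
        for _ in range(ell):
            if z in table:
                x = table[z]            # replay the chain from its start
                for _ in range(ell):
                    if f[x] == y:
                        return x
                    x = (f[x] + c) % N
            z = (f[z] + c) % N          # advance: z <- g_i(z)
        # fall through to the next table
    return None

# Toy run: N = 4096, m = ell = r = 16 ~ N^{1/3}.
N = 4096
rng = random.Random(7)
f = [rng.randrange(N) for _ in range(N)]   # a random function on [N]
m = ell = r = 16
tables, shifts = build_tables(f, N, m, ell, r, rng)
hits = sum(invert(f, f[x], N, tables, shifts, ell) is not None
           for x in range(200))
assert hits > 0   # a sizable fraction of images should be inverted
```

The stored data is m · r pairs and the online work is at most r · ℓ chain steps per query (plus replays), mirroring the S = Õ(m · r), T = Õ(ℓ · r) accounting on the slide.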


SLIDE 55

Inverting arbitrary functions [Fiat-Naor]

  • Store a table of the K elements with many pre-images. The collision probability restricted to the remaining inputs is ≈ 1/K.
  • Each hᵢ only needs to be an ℓ-wise independent hash function. Also, h₁, . . . , hᵣ only need to be pairwise independent.
  • Amortize the time for one evaluation each of h₁, . . . , hᵣ to Õ(ℓ + r). Then T = (time to compute h₁, . . . , hᵣ) · ℓ = Õ(ℓ² + ℓ · r) and S = Õ(K + m · r).
  • Can again choose m, ℓ such that m · ℓ² · λ ≈ m · ℓ²/K ≪ 1. Can get T, S = Õ(N^{3/4}) by taking K = Õ(N^{3/4}), r = Õ(N^{1/2}) and m, ℓ = Õ(N^{1/4}).

15 / 26
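The parameter choice on this slide can be sanity-checked with exponent arithmetic over powers of N (constants and log factors suppressed; this bookkeeping is our own, not code from the paper):

```python
from fractions import Fraction as F

# Exponents of N: K = N^{3/4}, r = N^{1/2}, m = ell = N^{1/4}.
K, r, m, ell = F(3, 4), F(1, 2), F(1, 4), F(1, 4)

# Products of quantities add exponents; sums take the max exponent.
T = max(2 * ell, ell + r)   # T = ~O(ell^2 + ell * r)
S = max(K, m + r)           # S = ~O(K + m * r)
assert T == F(3, 4) and S == F(3, 4)   # both are N^{3/4}

# Disjoint-paths condition: m * ell^2 * lambda ~ m * ell^2 / K <= N^0.
assert m + 2 * ell - K <= 0
```

With these exponents the chain condition is met exactly at the boundary, and both T and S land at N^{3/4}.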


SLIDE 58

Inverting f on an ε-fraction of inputs

  • Directly scaling the Fiat-Naor result would give complexity (εN)^{3/4} (we claimed ε^{5/4}N^{3/4}). Improved analysis using two simple ideas.
  • First observation: If a table of size K does not invert f with probability ε, then the collision probability on the rest is ε/K.
  • Second observation: The number of elements inverted by a path is not just the path length, but the sum of indegrees of elements in the path.

[Figure: two inputs x and x′ mapping to the same value f(x)]

16 / 26


SLIDE 63

Issues in analysis

  • Problem: Probabilities of inverting moderately-high-indegree elements do not add up (with our parameters) in the graphs for g₁, . . . , gᵣ. Also, these elements may not be in the table.
    • Analyze these separately using a weaker bound.
    • Either the weaker bound suffices, or we get better control on the collision probability.
  • Problem: The value of r is O(1) for some ranges of ε, and amortization over evaluations of h₁, . . . , hᵣ is not possible.
    • Use a better construction based on the lossless expanders of Capalbo et al. [CRVW02] and an observation of Siegel [Siegel89].
    • Takes ℓ^{o(1)} time per evaluation.
  • Final complexity: T, S = Õ(√(εN)) for ε ≤ N^{−1/3}; Õ(ε^{5/4}N^{3/4}) for ε ≥ N^{−1/3}.

17 / 26


SLIDE 68

Lower bound for inverting permutations

  • Given A inverting f on an ε fraction of inputs in time T and space S, want to show T · S = Ω(εN).
  • Shown by [Yao90] for ε = 1, and by [GT00], [Wee05] when T = O(√(εN)).
  • We give a simpler, “randomized” proof that works for all T. It also extends to lower bounds for PRGs.
  • As in [GT00], show that using A, one can encode f with ≈ log(N!) − φ(N, T) + S bits for some φ. Thus, S > φ(N, T), giving the tradeoff between T and S.
  • We show that using A, one can encode f using ≈ log(N!) − εN/(100T) + S bits, giving us the desired tradeoff.

18 / 26


SLIDE 71

Intuition for the encoding

[Figure: the domain [N] with a subset G, |G| = εN/(100T), and its image f(G)]

  • A inverts G correctly.
  • For all x ∈ G, A does not query any element in G.

19 / 26


SLIDE 74

Intuition for the encoding

  • Complexity of the encoding:
    • Size of G
    • Specify the set f(G)
    • Specify the map f⁻¹ on [N] − f(G)
  • This information, along with A, suffices to specify f entirely.

20 / 26


SLIDE 76

Intuition for the encoding

  • Total complexity of the encoding: 2 log C(N, |G|) + log((N − |G|)!), where C(N, k) is the binomial coefficient.
  • Putting |G| = εN/(100T), we get that S + (εN/T) · log(T²/(ε²N)) ≥ 0.

21 / 26
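Spelling out the arithmetic behind the slide (our own back-of-the-envelope reconstruction, with g := |G| = εN/(100T)):

```latex
% Encoding length: choose G, choose f(G), then list f^{-1} on [N] - f(G).
\begin{align*}
\mathrm{len}
  &= 2\log\binom{N}{g} + \log\big((N-g)!\big)\\
  &\le 2g\log\frac{eN}{g} + \log(N!) - g\log(N-g)
     \qquad\text{using } \binom{N}{g}\le\Big(\frac{eN}{g}\Big)^{g},\quad
     \frac{N!}{(N-g)!}\ge (N-g)^{g}\\
  &\approx \log(N!) - g\log\frac{g^{2}}{e^{2}N}
     \qquad (g\ll N)\\
  &= \log(N!) - g\,\log\Theta\!\Big(\frac{\epsilon^{2}N}{T^{2}}\Big).
\end{align*}
```

Since the S bits of advice plus the encoding must total at least log(N!), this gives S ≥ g · log Θ(ε²N/T²), which is the slide's inequality; the bound is nontrivial only while T ≤ ε√N.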


SLIDE 80

Upshot of the analysis

  • Provided T ≤ ε√N, TS = Ω̃(εN).
  • This was the analysis of Gennaro and Trevisan [GT00].
  • The analysis was improved by Wee [Wee05], who showed TS = Ω̃(εN) provided T ≤ √(εN).
  • There is still a gap, because “deterministically” deciding on G is very expensive.

22 / 26


SLIDE 83

Randomized encoding

[Figure: the domain [N] with a random set R, |R| = N/(10T), containing a subset G, |G| = εN/(100T), and its image f(G)]

  • Choose R to be a set of size N/(10T) uniformly at random.
  • With high probability, R contains a set G of size εN/(100T) such that:
    • A inverts G correctly.
    • For all x ∈ G, A does not query any element in R.

23 / 26


SLIDE 86

Randomized encoding

  • Some savings in the analysis, as the identity of R is already known.
  • Once we know f outside R, we need to know “G in R” as opposed to “G in [N]”; this is the main source of the saving.
  • In all, we can describe the permutation in log(N!) − εN/(100T) + S bits, which gives us the result.

24 / 26


SLIDE 89

Conclusions

  • Non-uniform attacks can do better than uniform attacks on one-way functions and PRGs.
  • The best provable upper bound for one-way functions on all inputs remains N^{3/4}, and N^{2/3} is the best for “Hellman”-style arguments (Barkan, Biham and Shamir).
  • Techniques for proving lower bounds do not seem to do any better for one-way functions than for permutations, i.e. Ω(N^{1/2}).

25 / 26

SLIDE 90

Thank You. Questions?

26 / 26