SLIDE 1

Typically-Correct Derandomization for Small Time and Space

William M. Hoza¹, University of Texas at Austin
July 18, CCC 2019

¹Supported by the NSF GRFP under Grant No. DGE1610403 and by a Harrington fellowship from UT Austin

SLIDE 2

Time, space, and randomness

SLIDE 11

Derandomization

◮ Suppose L ∈ BPTISP(T, S)
  ◮ T = T(n) ≥ n
  ◮ S = S(n) ≥ log n

◮ Theorem [Klivans, van Melkebeek ’02]:
  ◮ Assume some language in DSPACE(O(n)) has exponential circuit complexity
  ◮ Then L ∈ DTISP(poly(T), S)

◮ Theorem [Nisan, Zuckerman ’96]:
  ◮ Suppose S ≥ T^Ω(1)
  ◮ Then L ∈ DSPACE(S) (runtime 2^Θ(S))

SLIDE 17

Main result

◮ Suppose L ∈ BPTISP(T, S)
  ◮ T = T(n) ≥ n
  ◮ S = S(n) ≥ log n

◮ Theorem:
  ◮ Suppose T ≤ n · poly(S)
  ◮ Then there is a DSPACE(S) algorithm for L...
  ◮ ...that succeeds on the vast majority of inputs of each length.

◮ Think T = O(n), S = O(log n)
  ◮ [Saks, Zhou ’95]: Space Θ(log^1.5 n)

SLIDE 22

Typically-correct derandomizations

◮ Is 110100001101001111001010110111011010100011100 ∈ L?
◮ If only we had some randomness...
◮ Let A be a randomized algorithm
◮ Naïve derandomization: Run A(x, x)
◮ Might fail on all x because of correlations between input, coins

SLIDE 30

Prior techniques for dealing with correlations

  • 1. Find algorithm A where most random strings are good for all inputs simultaneously
    ◮ [Goldreich, Wigderson ’02]: Undirected s-t connectivity
    ◮ [Arvind, Torán ’04]: Solvable group isomorphism
  • 2. Extract randomness from input using specialized extractor
    ◮ [Zimand ’08]: Sublinear time algorithms
    ◮ [Shaltiel ’11]: Two-party communication protocols, streaming algorithms, BPAC0
  • 3. Plug input into seed-extending pseudorandom generator
    ◮ [Kinne, van Melkebeek, Shaltiel ’12]: Multiparty communication protocols, BPAC0 with symmetric gates

SLIDE 33

Our technique: “Out of sight, out of mind”

◮ Use part of the input as a source of randomness while A is processing the rest of the input

110100001101001111001010110111011010100011100
T H H T H T T H H T H T H

◮ (Additional ideas needed to make this work...)

SLIDE 38

Restriction of algorithm

110100001101001111001010110111011010100011100

◮ Let I ⊆ [n]
◮ Algorithm A|[n]\I:
  • 1. Run A like normal...
  • 2. ...except, if A is about to query x_i for some i ∈ I, halt immediately.
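The restriction A|[n]\I can be pictured as a wrapper around the input oracle. The toy model below is my own illustration (a trivial "count the ones" scan stands in for A); it simply halts the moment a blocked index is queried:

```python
class HaltedOnBlockedIndex(Exception):
    """Raised when the algorithm is about to query x_i for some i in I."""

def make_restricted_input(x, blocked):
    """Wrap input x so that querying any index in `blocked` halts the run."""
    def query(i):
        if i in blocked:
            raise HaltedOnBlockedIndex(i)
        return x[i]
    return query

def count_ones(query, n):
    """Toy stand-in for A: scan positions 0..n-1, counting ones."""
    count, i = 0, 0
    try:
        while i < n:
            count += query(i)
            i += 1
    except HaltedOnBlockedIndex:
        pass  # A|[n]\I halts immediately here
    return count, i  # i records the position where the run stopped

x = [1, 1, 0, 1, 0, 1]
result = count_ones(make_restricted_input(x, blocked={3, 4}), len(x))  # → (2, 3)
```

The pair returned plays the role of the machine's configuration: where the run stopped, and what state it was in when it stopped.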

SLIDE 42

Main Lemma: Reducing randomness to polylog n

◮ Main Lemma:

◮ Suppose L ∈ BPTISP(Õ(n), log n)
◮ There is a BPL algorithm for L that uses just polylog n random bits (one-way access)...
◮ ...that succeeds on the vast majority of inputs of each length.

SLIDE 45

Tool 1: Nisan’s Pseudorandom Generator

NisGen: seed s = Θ(log^2 n) bits → output T = O(n) bits

◮ For any O(log n)-space A and input x, A(x, NisGen(U_s)) ≈ A(x, U_T)
◮ Runs in space O(log n)...
◮ ...given two-way access to seed
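For intuition: Nisan's generator doubles its output at each level via pairwise-independent hash functions, G(s, h) = G(s) ∘ G(h(s)). The sketch below is a toy version over Z_p with illustrative parameters, not the real construction from the talk:

```python
def nisan_gen(seed_block, hashes, p):
    """Toy Nisan-style recursion: each pair (a, b) defines the hash
    h(s) = (a*s + b) % p, and each level doubles the list of output
    blocks via G(s, h) = G(s) || G(h(s))."""
    blocks = [seed_block]
    for a, b in hashes:
        blocks = blocks + [(a * s + b) % p for s in blocks]
    return blocks

# k hash functions expand one block into 2^k blocks
out = nisan_gen(3, hashes=[(2, 1), (5, 7)], p=11)  # → [3, 7, 0, 9]
```

With seed length Θ(log^2 n) (one block plus Θ(log n) hash descriptions of Θ(log n) bits each), Θ(log n) doubling levels reach Õ(n) output bits, matching the shape of the parameters above.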

SLIDE 48

Tool 2: Shaltiel-Umans Averaging Sampler

SUSamp: source x of log^100 n bits, seed d = O(log n) bits → output s = Θ(log^2 n) bits

◮ For any F : {0, 1}^s → V, for most x, F(SUSamp(x, U_d)) ≈ F(U_s)
◮ # bad x: at most 2^O(log^6 n) · |V|
◮ Runs in space O(log n) given two-way access to x

SLIDE 57

Algorithm of Main Lemma

[Diagram: a contiguous block of log^100 n input bits feeds into SUSamp, whose output seeds NisGen]

110100001101001111001010110111011010100011100

  • 1. Let v be the initial configuration of A
  • 2. Loop for polylog(n) iterations:
    2.1 Pick a random contiguous block I ⊆ [n]
    2.2 Pick a random y ∈ {0, 1}^O(log n)
    2.3 Let z = NisGen(SUSamp(x|I, y))
    2.4 Run A|[n]\I(x, z) from configuration v until it halts
    2.5 Update v to be the final configuration
  • 3. Accept if v is an accepting configuration, else reject
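The loop above can be sketched end to end. In this sketch, the composition NisGen(SUSamp(x|I, y)) is a stand-in stub (a hash of the block and the seed), and run_restricted abstracts running A|[n]\I from configuration v, so this only illustrates the control flow, not the real constructions:

```python
import random

def main_lemma_loop(x, run_restricted, iters, block_len, rng):
    """Control-flow sketch of the Main Lemma's algorithm.

    x: input bit list; run_restricted(v, blocked, z) returns the
    configuration where A|[n]\\I(x, z) halts when started from v."""
    v = 0  # toy encoding of A's initial configuration
    n = len(x)
    for _ in range(iters):
        start = rng.randrange(max(1, n - block_len + 1))
        I = range(start, start + block_len)  # random contiguous block
        y = rng.getrandbits(16)              # O(log n) truly random bits
        # stand-in stub for z = NisGen(SUSamp(x|I, y))
        z = hash((tuple(x[i] for i in I), y))
        v = run_restricted(v, set(I), z)     # run A|[n]\I from v
    return v  # accept iff this is an accepting configuration

rng = random.Random(0)
final = main_lemma_loop([1, 0, 1, 1], lambda v, blocked, z: v + 1,
                        iters=5, block_len=2, rng=rng)
```

Note the only truly random bits consumed per iteration are the block position and the short seed y, which is what brings the total randomness down to polylog n.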
slide-58
SLIDE 58

Efficiency analysis

  • 1. Loop for polylog(n) iterations:

1.1 Pick a random contiguous block I ⊆ [n] 1.2 Pick a random y ∈ {0, 1}O(log n) 1.3 Let z = NisGen(SUSamp(x|I, y)) 1.4 Run A|[n]\I(x, z) from configuration v until it halts 1.5 Update v to be the final configuration

  • 2. Accept iff v accepts
slide-59
SLIDE 59

Efficiency analysis

  • 1. Loop for polylog(n) iterations:

1.1 Pick a random contiguous block I ⊆ [n] 1.2 Pick a random y ∈ {0, 1}O(log n) 1.3 Let z = NisGen(SUSamp(x|I, y)) 1.4 Run A|[n]\I(x, z) from configuration v until it halts 1.5 Update v to be the final configuration

  • 2. Accept iff v accepts

◮ Runs in O(log n) space!

slide-60
SLIDE 60

Efficiency analysis

  • 1. Loop for polylog(n) iterations:

1.1 Pick a random contiguous block I ⊆ [n] 1.2 Pick a random y ∈ {0, 1}O(log n) 1.3 Let z = NisGen(SUSamp(x|I, y)) 1.4 Run A|[n]\I(x, z) from configuration v until it halts 1.5 Update v to be the final configuration

  • 2. Accept iff v accepts

◮ Runs in O(log n) space!

◮ O(log n) bits to store I, y, v

slide-61
SLIDE 61

Efficiency analysis

  • 1. Loop for polylog(n) iterations:

1.1 Pick a random contiguous block I ⊆ [n] 1.2 Pick a random y ∈ {0, 1}O(log n) 1.3 Let z = NisGen(SUSamp(x|I, y)) 1.4 Run A|[n]\I(x, z) from configuration v until it halts 1.5 Update v to be the final configuration

  • 2. Accept iff v accepts

◮ Runs in O(log n) space!

◮ O(log n) bits to store I, y, v ◮ O(log n) bits to run SUSamp, NisGen, A|[n]\I

slide-62
SLIDE 62

Efficiency analysis

  • 1. Loop for polylog(n) iterations:

1.1 Pick a random contiguous block I ⊆ [n] 1.2 Pick a random y ∈ {0, 1}O(log n) 1.3 Let z = NisGen(SUSamp(x|I, y)) 1.4 Run A|[n]\I(x, z) from configuration v until it halts 1.5 Update v to be the final configuration

  • 2. Accept iff v accepts

◮ Runs in O(log n) space!

◮ O(log n) bits to store I, y, v ◮ O(log n) bits to run SUSamp, NisGen, A|[n]\I

◮ We can give SUSamp, NisGen two-way access to their inputs, because we have two-way access to x

slide-63
SLIDE 63

Efficiency analysis

  • 1. Loop for polylog(n) iterations:
    1.1 Pick a random contiguous block I ⊆ [n]
    1.2 Pick a random y ∈ {0, 1}^O(log n)
    1.3 Let z = NisGen(SUSamp(x|I, y))
    1.4 Run A|[n]\I(x, z) from configuration v until it halts
    1.5 Update v to be the final configuration
  • 2. Accept iff v accepts

◮ Runs in O(log n) space!
  ◮ O(log n) bits to store I, y, v
  ◮ O(log n) bits to run SUSamp, NisGen, A|[n]\I
  ◮ We can give SUSamp, NisGen two-way access to their inputs, because we have two-way access to x
◮ Randomness: polylog n bits (one-way access!)

SLIDE 75

Correctness proof sketch

Our algorithm

  • 1. Pick random y ∈ {0, 1}^O(log n)
  • 2. Let z = NisGen(SUSamp(x|I, y))
  • 3. Run A|[n]\I(x, z) from v
  • 4. Update v := final configuration

First hybrid distribution

  • 1. Pick random y′ ∈ {0, 1}^O(log^2 n)
  • 2. Let z = NisGen(y′)
  • 3. Run A|[n]\I(x, z) from v
  • 4. Update v := final configuration

◮ Let F(y′) = final configuration when running A|[n]\I(x, NisGen(y′)) from v
◮ # bad x bounded by:
    (2^O(log^6 n) · poly(n))   [# bad x|I]
  × 2^(n − log^100 n)          [# x|[n]\I]
  × n                          [# I]
  × poly(n)                    [# v]
  ≪ 2^n
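As a sanity check on this union bound (my own arithmetic, not on the slides): writing every factor as a power of 2,

```latex
\underbrace{2^{O(\log^{6} n)}\cdot\mathrm{poly}(n)}_{\#\text{ bad }x|_I}
\cdot\underbrace{2^{\,n-\log^{100} n}}_{\#\,x|_{[n]\setminus I}}
\cdot\underbrace{n}_{\#\,I}
\cdot\underbrace{\mathrm{poly}(n)}_{\#\,v}
\;=\; 2^{\,n-\log^{100} n+O(\log^{6} n)}
\;\ll\; 2^{n},
```

since poly(n) = 2^O(log n) and log^100 n dominates O(log^6 n). So only a 2^−(1−o(1)) log^100 n fraction of inputs of each length n is bad for this hybrid step.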

SLIDE 79

Correctness proof sketch (2)

First hybrid distribution

  • 1. Pick random y′ ∈ {0, 1}^O(log^2 n)
  • 2. Let z = NisGen(y′)
  • 3. Run A|[n]\I(x, z) from v
  • 4. Update v := final configuration

Second hybrid distribution

  • 1. Pick random z ∈ {0, 1}^T
  • 2. Run A|[n]\I(x, z) from v
  • 3. Update v := final configuration
SLIDE 85

Correctness proof sketch (3)

Second hybrid distribution

  • 1. Repeat polylog(n) times:
    1.1 Pick random I ⊆ [n]
    1.2 Pick random z ∈ {0, 1}^T
    1.3 Run A|[n]\I(x, z) from v
    1.4 Update v := final conf.
  • 2. Accept iff v accepts

Target distribution

  • 1. Pick random z ∈ {0, 1}^T
  • 2. Accept iff A(x, z) accepts

◮ In each phase, simulate Ω(n/log^100 n) steps of A w.h.p.
◮ After polylog(n) phases, reach halting configuration w.h.p.


Tool 3: The Nisan-Zuckerman PRG

NZGen: seed d = O(log n) bits → output log^c n bits

◮ c = arbitrarily large constant
◮ For any O(log n)-space A and input x, A(x, NZGen(U_d)) ≈ A(x, U_(log^c n))
◮ Runs in space O(log n)
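The Nisan-Zuckerman idea is to reuse one short random source, extracting a fresh pseudorandom block from it with each of several short extractor seeds. In this toy sketch the extractor is an illustrative hash stub, not a real seeded extractor:

```python
def nz_gen(source, ext_seeds, block_len):
    """Output one block_len-bit block per extractor seed, each drawn from
    the same short random source (stand-in for Ext(source, seed))."""
    return [hash((source, y)) % (2 ** block_len) for y in ext_seeds]

# one short source stretched into several pseudorandom blocks
blocks = nz_gen(source=0b110100111010, ext_seeds=[1, 2, 3], block_len=8)
```

The point of the real construction is that a space-O(log n) machine cannot remember enough about earlier blocks to notice that all of them came from the same source.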

SLIDE 95

Eliminating randomness

◮ Suppose L ∈ BPTISP(Õ(n), log n)

◮ Corollary:
  ◮ There is a BPL algorithm for L that uses just O(log n) random bits...
  ◮ ...that succeeds on the vast majority of inputs of each length.

◮ Corollary:
  ◮ There is a DSPACE(log n) algorithm for L...
  ◮ ...that succeeds on the vast majority of inputs of each length.

SLIDE 97

Main open problem

◮ Typically-correct derandomization of BPL?

◮ Thanks! Questions?