
Probabilistic Computation

Lecture 15: Computing with Less Randomness, or with Imperfect Randomness

Soundness Amplification for BPP

• Repeat M(x) t times and take the majority answer, i.e., estimate Pr[M(x)=yes] and check if it is > 1/2
• Error only if |estimate - real| ≥ gap/2
• Estimation error goes down exponentially with t (Chernoff bound):
  Pr[ |estimate - real| ≥ δ/2 ] ≤ 2^(-Ω(t·δ²))
• t = O(n^d/δ²) is enough for Pr[error] ≤ 2^(-n^d)
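The majority-vote amplification can be sketched as a small simulation. This is illustrative only: the stand-in machine, the 0.6 acceptance probability (a gap of 0.1 above 1/2), and all names are our choices, not the lecture's.

```python
import random

def amplify(machine, t, rng):
    """Majority vote over t independent runs of the machine."""
    yes = sum(machine(rng) for _ in range(t))
    return yes > t / 2

def error_rate(t, trials=2000, seed=0):
    """Empirical probability that the majority vote answers wrongly."""
    rng = random.Random(seed)
    # Illustrative stand-in for M(x) on a yes-instance:
    # answers "yes" with probability 0.6, i.e. a gap of 0.1 above 1/2.
    machine = lambda r: r.random() < 0.6
    wrong = sum(not amplify(machine, t, rng) for _ in range(trials))
    return wrong / trials

# Error should drop sharply as t grows, per the Chernoff bound.
for t in (1, 25, 101):
    print(t, error_rate(t))
```

With a gap this small the single-run error is about 0.4, while a hundred repetitions already push the empirical error near zero.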

Randomness Efficient Soundness Amplification

• In repeating t times (to reduce the error to 2^(-Ω(t))), the number of coins used is t·m
• Used independent random tapes to get error 2^(-Ω(t))
• Can use very dependent tapes and still get error 2^(-Ω(t))! (but with a smaller constant inside the Ω)
• Random tapes produced using a random walk on an "expander graph"
  • Number of coins used = m + O(t)

Randomness Efficient Soundness Amplification

• Space of all random tapes = {0,1}^m. Consider a subset (the "yes" set); we want to estimate its weight p.
• By Chernoff, if p' is the estimate from t independent samples, then Pr[ |p'-p| > εp ] < 2^(-Ω(t·ε²))
• Random walk: superimpose an "expander graph" on this space. Pick the first point at random, then take a random walk of length t along the graph edges. Estimate p' = fraction of yes nodes along the path.
• The expander's degree is constant, so the number of coins needed = m + O(t)
• Expander "mixing": Pr[ |p'-p| > εp ] < 2^(-Ω(t·ε²)) (but with a smaller constant inside the Ω)
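The sampling pattern can be sketched as follows. As a hedge: building an explicit expander is beyond this sketch, so a union of random permutations stands in for one (such a multigraph is an expander with high probability); this illustrates the walk-based estimator, not the derandomized construction.

```python
import random

def make_graph(n, d, rng):
    """Stand-in for an explicit expander: the union of d random
    permutations of [n], a 2d-regular multigraph (expander w.h.p.)."""
    nbrs = [[] for _ in range(n)]
    for _ in range(d):
        perm = rng.sample(range(n), n)
        for u in range(n):
            nbrs[u].append(perm[u])   # forward edge
            nbrs[perm[u]].append(u)   # backward edge
    return nbrs

def walk_estimate(yes, nbrs, t, rng):
    """One random start (m coins) plus t steps of O(1) coins each;
    return the fraction of 'yes' nodes seen along the path."""
    v = rng.randrange(len(nbrs))
    hits = 0
    for _ in range(t):
        hits += v in yes
        v = rng.choice(nbrs[v])
    return hits / t

rng = random.Random(1)
n = 512
nbrs = make_graph(n, 3, rng)
yes = set(range(n // 2))           # a "yes" set of weight p = 1/2
est = walk_estimate(yes, nbrs, 200, rng)
print(est)                         # should land near p = 0.5
```

The point of the construction is the coin count: one free starting point costs m coins, and each subsequent step costs only log(degree) = O(1) coins, for m + O(t) in total.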

Soundness Amplification

• Probabilistic Approximately Correct estimation of Pr[yes]
  • Bounded gap: so it is enough to approximate
  • A small probability of error is still allowed
• Not "derandomization": we are only trying to minimize the amount of randomness used
• Still need perfectly random bits (fair, independent coin tosses)
  • Not a realistic assumption on random sources
  • Can we work with imperfect random sources?

Philosophical Issues with Randomness/Probability

Imperfect Randomness

• Perfect: fair coin flips
• Slightly imperfect:
  • sufficient unpredictability (entropy)
  • sufficient independence
  • the exact distribution is unknown, but belongs to a known class of distributions

Imperfect Randomness

• Bit-wise guarantees:
  • von Neumann source: independent but not fair. Each bit is independent of the previous bits, but has a bias; the bias is the same for all bits.
  • Santha-Vazirani source: dependent bits of varying bias. Each bit can depend on all previous bits, but Pr[bi=0], Pr[bi=1] ∈ [1/2-δ/2, 1/2+δ/2], even conditioned on all previous bits (i.e., the source is sufficiently unpredictable).
• Weaker guarantees: e.g., block sources
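A toy sampler for an SV(δ) source may make the definition concrete. The "adversarial" rule here (tilting each bit towards the parity of the prefix) is an arbitrary choice of ours for illustration; any rule that keeps the conditional probabilities inside the band is a valid SV source.

```python
import random

def sv_source(m, delta, rng):
    """Sample m bits from an illustrative Santha-Vazirani source:
    each bit may depend on the whole prefix, but its conditional
    probability of being 1 stays in [1/2 - delta/2, 1/2 + delta/2]."""
    bits = []
    for _ in range(m):
        # Hypothetical adversarial rule, chosen only for the demo:
        # tilt the next bit according to the parity of the prefix.
        tilt = delta / 2 if sum(bits) % 2 == 0 else -delta / 2
        bits.append(1 if rng.random() < 0.5 + tilt else 0)
    return bits

rng = random.Random(0)
sample = sv_source(20, 0.2, rng)
print(sample)
```

Note that, unlike a von Neumann source, the bias here changes from bit to bit, which is exactly what defeats simple pairing-style extractors.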

BPP using imperfect randomness

• A small-bias SV source (δ < 1/m, where m coins are used in all) is harmless:
  • Any string has weight at most (1/2 + δ/2)^m   [using the bound on the conditional probability of each bit]
  • t strings can have weight at most t·(1/2 + δ/2)^m
  • t·(1/2 + δ/2)^m = (t/2^m)·(1+δ)^m < (t/2^m)·e if δ < 1/m   [since (1+x)^(1/x) ≤ e]
• So if on perfect randomness Pr[error] < 1/(e·2^n), then on an SV source with bias < 1/m, Pr[error] < 1/2^n
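The arithmetic in this bound is easy to check numerically. `weight_bound` is our name for the quantity t·(1/2 + δ/2)^m from the slide.

```python
import math

def weight_bound(t, m, delta):
    """Maximum total SV-weight of t strings of length m:
    t * (1/2 + delta/2)^m."""
    return t * (0.5 + delta / 2) ** m

m, t = 100, 12
delta = 1 / (m + 1)           # bias strictly below 1/m
lhs = weight_bound(t, m, delta)
rhs = (t / 2 ** m) * math.e   # the bound via (1+x)^(1/x) <= e
print(lhs < rhs)              # → True
```

With a large bias (say δ = 1/2) the inequality fails badly, which is why the slide insists on δ < 1/m.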

BPP using imperfect randomness

• Handling more imperfectness by pre-processing the randomness: randomness extraction
• Simple extractor:

    Biased input → [ Ext ] → Almost unbiased output

Extraction for von Neumann sources

• Simple extractor for von Neumann sources. For each input-bit pair (r2i, r2i+1):
  • 01: output 0
  • 10: output 1
  • otherwise: discard
• Perfectly random output
• Fewer output bits
• Running time (per output bit): a constant number of tries, in expectation
• Can be generalized to sources which are (hidden) Markov chains
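The pairing trick is short enough to implement directly. This sketch assumes independent bits with a fixed bias (0.7 here, our choice for illustration), exactly the von Neumann model; the output is unbiased because Pr[01] = Pr[10] = p(1-p) for any fixed p.

```python
import random

def von_neumann_extract(bits):
    """01 -> output 0, 10 -> output 1, 00/11 -> discard.
    Unbiased for any fixed bias p, since Pr[01] = Pr[10] = p(1-p)."""
    return [r0 for r0, r1 in zip(bits[0::2], bits[1::2]) if r0 != r1]

rng = random.Random(0)
biased = [1 if rng.random() < 0.7 else 0 for _ in range(100000)]
out = von_neumann_extract(biased)

print(len(out) / (len(biased) // 2))   # fraction of surviving pairs, ~2*0.7*0.3 = 0.42
print(sum(out) / len(out))             # ~0.5 despite the 0.7 input bias
```

This also shows the cost noted on the slide: only a 2p(1-p) fraction of pairs produce an output bit, so the output is much shorter than the input.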

Extractor for SV sources?

• No simple (deterministic) extractor exists, even for one bit of output
• For any extractor, one can find an SV source on which the extractor "fails": the output bias is no better than the input bias [Exercise]

Randomized Extractors

• Randomized extractor: some perfect randomness as a catalyst

    Biased input + seed randomness → [ Ext ] → Almost unbiased output

• Running a BPP algorithm with only the imperfect source:
  Draw one string from the biased source and generate random tapes, one for each possible seed. If the algorithm accepts on more than half the random tapes, accept.
• Polynomial time, if the seed is logarithmically short
• Error probability remains bounded [Exercise]
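The "enumerate all seeds and take a majority" procedure can be sketched as follows. `toy_ext` and `toy_algorithm` are placeholders of our own, not a real extractor or a real BPP machine; only the control flow reflects the slide.

```python
from itertools import product

def run_with_imperfect_source(algorithm, ext, source_bits, seed_len):
    """Enumerate every seed, extract one tape per seed, run the
    algorithm on each tape, and answer by majority vote.
    Polynomial time when seed_len = O(log n): 2^seed_len tapes."""
    seeds = list(product([0, 1], repeat=seed_len))
    accepts = sum(algorithm(ext(source_bits, s)) for s in seeds)
    return accepts > len(seeds) / 2

def toy_ext(bits, seed):
    # Placeholder "extractor": XOR the source against the repeated seed.
    # (Not a real extractor; it only marks where the seed plugs in.)
    d = len(seed)
    return [b ^ seed[i % d] for i, b in enumerate(bits)]

def toy_algorithm(tape):
    # Placeholder BPP machine: accepts iff its tape has a majority of 1s.
    return sum(tape) > len(tape) / 2

print(run_with_imperfect_source(toy_algorithm, toy_ext, [1] * 8, 2))  # → False
```

The key design point is that no fresh perfect randomness is used at run time: the only random object drawn is the single string from the biased source, and the seed space is exhausted deterministically.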

Extractor for SV sources

• Randomized extractor. Input: SV(δ) for a constant δ < 1
• Plan: get to a small (conditional) bias, O(1/m), for each output bit
• Weak extraction: split the input into blocks R1, R2, ..., and with seed S output ai = <Ri, S>
• Using seed-length d = O(log m)
• Analysis: need to bound only the collision probability for an input block of length d [Exercise]
• Collision prob ≤ max prob ≤ (1/2 + δ/2)^d = 1/poly(m)
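The weak extraction step ai = <Ri, S> over GF(2) in code. Two simplifications of ours for determinism: the seed is fixed rather than drawn at random, and the input is independent biased bits rather than a full SV adversary; the inner-product computation itself is the one on the slide.

```python
import random

def inner_product_bit(block, seed):
    """One output bit: a_i = <R_i, S> over GF(2)."""
    return sum(r & s for r, s in zip(block, seed)) % 2

def weak_extract(bits, seed):
    """Split the source into length-d blocks and output one
    inner-product bit per block."""
    d = len(seed)
    return [inner_product_bit(bits[i:i + d], seed)
            for i in range(0, len(bits) - d + 1, d)]

# Fixed nonzero seed for reproducibility; normally S is random.
seed = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
rng = random.Random(0)
# Illustrative biased input (independent bits of bias 0.6).
src = [1 if rng.random() < 0.6 else 0 for _ in range(len(seed) * 5000)]
out = weak_extract(src, seed)
print(sum(out) / len(out))   # close to 1/2
```

Each output bit XORs several input bits together, so the per-bit bias shrinks geometrically with the number of positions the seed selects, matching the (1/2 + δ/2)^d collision bound.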

Extractors

• Extractors with logarithmic seed-length are known for more general classes of sources (block sources)
  • which extract "almost all" the entropy in the input
  • the output can be made "arbitrarily close" to uniform
• Bottom line: BPP algorithms can be run efficiently using very general classes of sources of randomness

Extracting from independent sources

• Simple (deterministic) extraction is possible: given samples R and S from two independent sources, output a = <R, S>
• Challenge: extract almost all the entropy from two independent sources
• Known, with a few more sources

Today

• Efficient soundness amplification using expanders
• Imperfect random sources: von Neumann, SV, and more
• Extractors
  • for von Neumann sources, SV sources, and more
  • can extract almost all the entropy into an almost uniform output using logarithmic seed-length
  • closely related to other tools: pseudorandomness generators, list-decodable codes
  • useful in "derandomization"