Shannon, Turing and Hats: Information Theory Incompleteness


SLIDE 1

Shannon, Turing and Hats: Information Theory Incompleteness

Francois Durand, Fabien Mathieu, Philippe Jacquet

WITMSE 2017

September 13th, 2017

SLIDE 2

Roadmap

1. Getting Rich without a DeLorean
2. The Shannon Approach
3. Days of infinite past

2/21

SLIDE 4

Back to Back to the Future

Plot of episode 2:

A poor guy + a Time Machine + an Almanac = a rich guy.
What if there is no Time Machine?

4/21


SLIDE 9

The LMCA problem

Biff is a huge fan of English soccer:

- He wants to bet on one of his 4 favorite teams.
- He gathers all the statistics of the last century.
- Can he predict next year's champion?

5/21


SLIDE 13

Roadmap

1. Getting Rich without a DeLorean
2. The Shannon Approach
3. Days of infinite past

6/21

SLIDE 14

Prediction of a LMCA sequence

Shannon: the champion sequence is a signal.

With a long enough sequence, Biff can try to predict things.
Statistics example: championship counts of the four teams = (18, 20, 6, 13).
The Premier League may be a Markov source...

[Diagram: Markov chain over the four teams, transition probabilities 2/3, 7/10, 7/13.]

7/21

SLIDE 15

Prediction of a LMCA sequence

Fully deterministic model

[Diagram: the observed champion sequence reproduced exactly, every transition having probability 1.]

Best success rate: 100/100. Entropy: 0.

7/21


SLIDE 17

Prediction of a LMCA sequence

With a one-year memory (Markov model)

[Diagram: Markov chain over the four teams; staying probabilities 2/3, 7/10, 7/13, leaving probabilities 1/3, 3/10, 6/13, and one team that never retains the title (probability 1 of leaving).]

Best success rate: 68.4/100. Entropy: 0.83.

7/21


SLIDE 19

Prediction of a LMCA sequence

Memoryless model: each year's champion is drawn independently with probabilities 18/57, 20/57, 6/57, 13/57.

Best success rate: 20/57 ≈ 35/100. Entropy: 1.88.

7/21


SLIDE 21

Prediction of a LMCA sequence

Uniformly memoryless model: each year's champion is drawn independently with probability 1/4 each.

Best success rate: 1/4. Entropy: 2.

7/21
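
The success rates and entropies above can be checked numerically. Below is a minimal sketch; the exact Markov transition structure is an assumption reconstructed from the slide's diagram (staying probabilities 2/3, 7/10, 7/13, plus one team that never retains the title).

```python
import math

def entropy(ps):
    """Shannon entropy in bits of a distribution given as a list of probabilities."""
    return -sum(p * math.log2(p) for p in ps if p > 0)

# Memoryless model: champion drawn i.i.d. from the empirical frequencies.
freqs = [18/57, 20/57, 6/57, 13/57]
best_rate_memoryless = max(freqs)          # always guess the most frequent team
print(round(best_rate_memoryless, 3))      # 20/57 ≈ 0.351
print(round(entropy(freqs), 2))            # ≈ 1.88 bits

# Uniformly memoryless model.
print(max([1/4] * 4))                      # 0.25
print(entropy([1/4] * 4))                  # 2.0 bits

# One-year-memory (Markov) model, one transition row per team
# (assumption: first entry of each row is the probability of keeping the title).
rows = [[2/3, 1/3], [7/10, 3/10], [1.0], [7/13, 6/13]]
pi = [18/57, 20/57, 6/57, 13/57]           # stationary distribution
best_rate_markov = sum(p * max(row) for p, row in zip(pi, rows))
entropy_rate = sum(p * entropy(row) for p, row in zip(pi, rows))
print(round(best_rate_markov, 3))          # 39/57 ≈ 0.684
print(round(entropy_rate, 2))              # ≈ 0.83 bits
```

The entropy rate of a Markov source is the stationary average of the per-row entropies, which is why the one-year-memory model lands between the deterministic (0 bits) and memoryless (1.88 bits) cases.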


SLIDE 23

Biff's problem

If the source is uniformly memoryless...

- Biff cannot do better than 1/4: not enough to get rich.
- No matter how much data he gathers.
- What if the past is infinite?

8/21


SLIDE 28

Roadmap

1. Getting Rich without a DeLorean
2. The Shannon Approach
3. Days of infinite past

9/21

SLIDE 29

Infinite past model

We'll make the following assumptions:

- The past is infinite: no Big Bang.
- Biff, Arsenal, Chelsea, Liverpool and Manchester are immortal.
- Yearly champion: uniformly memoryless source.
- The universe ends at year 576,000,000,000 AD.

10/21


SLIDE 34

Reminder: hat puzzles

Hat puzzles

- A bunch of people wear hats.
- You do not know the color of your own hat.
- Guess your color by looking at a subset of the others' hats.

They exist in multiple flavors

- With or without additional information.
- Finite or infinite population.

11/21


SLIDE 42

Axiom of choice

Definition

Two infinite sequences of winners X and Y are equivalent if they differ in only a finite number of years.

Axiom of choice

Biff can choose one representative sequence per equivalence class.

12/21


SLIDE 45

Biff's prediction algorithm

1. Build a sequence by concatenating the observed past with Manchester-padding.
2. Take the representative of its equivalence class.
3. Announce the result predicted by the representative.

13/21
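
The three steps above can be sketched in code, with one large caveat: the real construction is non-constructive, since the choice of representatives requires the axiom of choice. In the toy model below we can only represent sequences that differ from the all-Manchester sequence in finitely many years, so there is a single equivalence class and a trivially computable "choice function". All names (`Sequence`, `representative`, `biff_predicts`) are illustrative, not from the talk.

```python
# Toy model: a "sequence" is a base rule plus finitely many exceptions.
class Sequence:
    def __init__(self, base, exceptions=None):
        self.base = base                      # function: year -> team
        self.exceptions = dict(exceptions or {})
    def __call__(self, year):
        return self.exceptions.get(year, self.base(year))

def representative(seq):
    # Choice function for our single equivalence class: drop the exceptions.
    return Sequence(seq.base)

def biff_predicts(observed_past, year):
    # 1. Observed past + Manchester-padding.
    padded = Sequence(lambda y: "Manchester", observed_past)
    # 2. Representative of its equivalence class; 3. announce its prediction.
    return representative(padded)(year)

# A "reality" that happens to settle on Manchester after 2019:
reality = Sequence(lambda y: "Manchester",
                   {2017: "Arsenal", 2018: "Chelsea", 2019: "Liverpool"})
errors = [y for y in range(2017, 2100)
          if biff_predicts({t: reality(t) for t in range(2000, y)}, y) != reality(y)]
print(errors)   # finitely many mistakes: [2017, 2018, 2019]
```

The demo only works because this reality is itself a finite modification of all-Manchester; for an arbitrary reality the representative exists but cannot be computed, which is exactly what makes the following slides paradoxical.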


SLIDE 49

Biff is wrong only a finite number of times

Assume any reality:

- Biff always guesses the same sequence (his chosen representative).
- That sequence differs from reality in only a finite number of years.
- So Biff is wrong only a finite number of times. Success rate: 1.
- The result does not depend on the way the sequence is drawn!
- This holds even with maximal entropy (2 bits).

14/21


SLIDE 57

Biff is wrong most of the time

- The decision of 2017-Biff does not depend on the 2018 champion.
- Deferred decision: choose the 2018 champion after 2017-Biff has decided.
- Biff is wrong with probability 3/4.

15/21
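
The deferred-decision argument is easy to simulate: whatever fixed rule maps the observed past to a guess, if the champion is drawn uniformly after the guess is locked in, the guess fails with probability 3/4. A minimal sketch (`some_rule` is an arbitrary hypothetical rule, not Biff's actual strategy):

```python
import random

random.seed(42)
TEAMS = ["Arsenal", "Chelsea", "Liverpool", "Manchester"]

# Any deterministic rule mapping the observed past to a guess will do;
# the point is that the guess is fixed *before* the champion is drawn.
def some_rule(past):
    return past[-1] if past else "Manchester"   # e.g. "same as last year"

trials = 100_000
wrong = 0
for _ in range(trials):
    past = [random.choice(TEAMS) for _ in range(10)]  # uniform memoryless past
    guess = some_rule(past)
    champion = random.choice(TEAMS)                   # drawn after the guess
    wrong += (guess != champion)

print(wrong / trials)   # ≈ 3/4
```

Swapping in any other rule for `some_rule` leaves the result unchanged, which is the heart of the paradox on the next slide.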


SLIDE 61

Paradox explained

- If there is a probability that Biff is wrong at year t, it must be 3/4.
- Hence the probability that Biff is wrong infinitely often is at least 3/4.
- But Biff is wrong only a finite number of times, so the probability of being infinitely wrong is 0.

Conclusion

The event "Biff is wrong" cannot be measured.

16/21


SLIDE 66

Getting rid of the axiom of choice

God doesn't play dice with the universe:

- Assume the champions are produced by a Backward Turing Machine.
- Inference from the past is kept.
- The axiom of choice is no longer needed.

17/21


SLIDE 71

Biff in a multiverse of Turing Machines

- The set of Turing machines is enumerable.
- Biff selects the first machine that makes only finitely many errors.
- Same reasoning as before:
  - Biff always guesses the same sequence.
  - He is wrong only a finite number of times.

18/21
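
The selection step can be sketched as code, again with a caveat: "makes only finitely many errors" is not decidable for real Turing machines, so this toy version enumerates a short list of ordinary Python functions (all hypothetical) and keeps the first one that agrees with the observed finite window.

```python
# Toy stand-in for enumerating Turing machines: a fixed list of candidate
# generator functions, tried in enumeration order.
TEAMS = ["Arsenal", "Chelsea", "Liverpool", "Manchester"]

candidates = [
    lambda y: "Arsenal",                 # machine 0: constant champion
    lambda y: TEAMS[y % 4],              # machine 1: 4-year cycle
    lambda y: TEAMS[(y * y) % 4],        # machine 2: another arithmetic rule
]

def select_machine(observed):
    """Return the first candidate consistent with the observed past.

    Exact agreement on a finite window is a toy stand-in for the
    undecidable condition 'finitely many errors on the infinite past'.
    """
    for m in candidates:                 # enumeration order matters
        if all(m(y) == champ for y, champ in observed.items()):
            return m
    return None

observed = {y: TEAMS[y % 4] for y in range(2000, 2018)}
chosen = select_machine(observed)
print(chosen(2018))                      # → "Liverpool" (2018 % 4 == 2)
```

Machine 0 is eliminated by the first non-Arsenal year, so the 4-year cycle is selected and used to announce next year's champion, mirroring Biff's strategy of committing to one machine forever.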


SLIDE 76

Paradox of the Turing machine multiverse

- Assume a probability distribution over Turing machines.
- The probability that Biff is wrong at year t is now perfectly defined.
- That probability must go to 0 as we travel back into the past.

Conclusion

Biff gets more and more wrong as the end draws near: the less Biff knows, the better he predicts.

19/21


SLIDE 81

Conclusion

Takeaway

- Information theory cannot be straightforwardly adapted to infinity. Is there a proper framework?
- Very early work with lots of open questions, e.g. what about an infinite future?
- Best way to get rich...

20/21


SLIDE 83

Thank you!

21/21