Anthropic Decision Theory


SLIDE 1

Anthropic Decision Theory

Why anthropic decisions make sense, but anthropic probabilities don’t.

I think, therefore I am. I am, therefore... I do?

SLIDE 2

Anthropic questions

Humanity on Earth implies... ⇒ ? ? ? ?

...what about the universe?

SLIDE 3

Sleeping Beauty I: Amnesia

[Diagram: Beauty sleeps from Sunday. Heads: woken on Monday only. Tails: woken on Monday and on Tuesday, with amnesia in between.]

SLIDE 4

Sleeping Beauty I: Amnesia

[Diagram: Beauty sleeps from Sunday. Heads: woken on Monday only. Tails: woken on Monday and on Tuesday, with amnesia in between.]

Upon awakening, what is the probability of Heads? Of Monday?

SLIDE 5

Sleeping Beauty II: Incubator

[Diagram: Heads: one observer is created, in Room 1. Tails: two observers are created, in Rooms 1 and 2.]

Upon awakening, what is the probability of Heads? Of Room 1?

SLIDE 6

Standard resolutions: probability

  • Halfer position: 1/2 on heads.
SLIDE 7

Standard resolutions: probability

  • Halfer position: 1/2 on heads.

Those are the initial odds. And you learn nothing new: no update.

SLIDE 8

Standard resolutions: probability

  • Halfer position: 1/2 on heads.

Those are the initial odds. And you learn nothing new: no update.

  • Thirder position: 1/3 on heads.

SLIDE 9

Standard resolutions: probability

  • Halfer position: 1/2 on heads.

Those are the initial odds. And you learn nothing new: no update.

  • Thirder position: 1/3 on heads.

Because “(Monday, heads)”, “(Monday, tails)”, and “(Tuesday, tails)” are indistinguishable.

SLIDE 10

Standard resolutions: probability

  • Halfer position: 1/2 on heads.

Those are the initial odds. And you learn nothing new: no update.

  • Thirder position: 1/3 on heads.

Because “(Monday, heads)”, “(Monday, tails)”, and “(Tuesday, tails)” are indistinguishable. “(Tuesday, heads)” must tell you something.

SLIDE 11

Standard resolutions: probability

  • Halfer position: 1/2 on heads.

Self-Sampling Assumption (SSA): An observer is randomly selected from the set of all actually existent observers in their reference class.

  • Thirder position: 1/3 on heads.

Self-Indication Assumption (SIA): An observer is randomly selected from the set of all possible observers.

SLIDE 12

Standard resolutions: probability

  • Halfer position: 1/2 on heads.

Self-Sampling Assumption (SSA): An observer is randomly selected from the set of all actually existent observers in their reference class.

  • Thirder position: 1/3 on heads.

Self-Indication Assumption (SIA): An observer is randomly selected from the set of all possible observers.

SLIDE 13

Standard resolutions: probability

  • Halfer position: 1/2 on heads.

Self-Sampling Assumption (SSA): An observer is randomly selected from the set of all actually existent observers in their reference class.

  • Thirder position: 1/3 on heads.

Self-Indication Assumption (SIA): An observer is randomly selected from the set of all possible observers.
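The two assumptions can be checked directly on the incubator setup from slide 5. This is a minimal sketch of my own, not from the talk: it weights worlds the SSA way (prior only) and the SIA way (prior times observer count).

```python
from fractions import Fraction

# Incubator variant: Heads -> one observer, Tails -> two observers; fair coin.
# Map each world to (prior probability, number of observers).
worlds = {"heads": (Fraction(1, 2), 1), "tails": (Fraction(1, 2), 2)}

# SSA: sample a world by its prior, then an observer inside that world,
# so each world keeps its prior weight regardless of population.
ssa_heads = worlds["heads"][0]

# SIA: weight each world by prior x number of observers.
weights = {w: p * n for w, (p, n) in worlds.items()}
sia_heads = weights["heads"] / sum(weights.values())

print(ssa_heads, sia_heads)  # 1/2 1/3
```

The halfer answer falls out of the SSA weighting and the thirder answer out of the SIA weighting, matching the two bullets above.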

SLIDE 14

Adam and Eve paradox

SSA prefers small universes (present and future)

SLIDE 15

?

Adam and Eve paradox

SSA prefers small universes (present and future)

SLIDE 16

?

Adam and Eve paradox

SSA prefers small universes (present and future)

SLIDE 17

Adam and Eve paradox

SSA prefers small universes (present and future)

?

SLIDE 18

Doomsday argument

SSA prefers small universes (present and future)

SLIDE 19

Presumptuous philosopher

SIA prefers large universes (present, not future)

SLIDE 20

Presumptuous philosopher

Λ=?

SIA prefers large universes (present, not future)

SLIDE 21

Presumptuous philosopher

Λ=?

SIA prefers large universes (present, not future)

SLIDE 22

Presumptuous philosopher

Λ=?

I know!!!

SIA prefers large universes (present, not future)

SLIDE 23

Presumptuous philosopher

Λ=?

I know!!!

SIA prefers large universes (present, not future)

SLIDE 24

Presumptuous philosopher

SIA prefers large universes (present, not future)

Λ=?

I know!!!

I’ll bet you at odds of a trillion to one on the trillion times bigger universe.

SLIDE 25

Presumptuous philosopher

Λ=?

I know!!!

I’ll bet you at odds of a trillion to one on the trillion times bigger universe. You can’t produce enough evidence to change my mind.

SIA prefers large universes (present, not future)
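The trillion-to-one betting odds follow mechanically from SIA's observer-weighting over two equally likely cosmologies. A small sketch of my own, with assumed (illustrative) relative observer counts:

```python
from fractions import Fraction

# Equal priors on the two cosmologies; the large universe is assumed to
# contain a trillion times as many observers (illustrative numbers).
prior_small = prior_large = Fraction(1, 2)
n_small, n_large = 1, 10**12

# SIA: weight each universe by prior x observer count.
w_small = prior_small * n_small
w_large = prior_large * n_large
p_large = w_large / (w_small + w_large)

odds_for_large = p_large / (1 - p_large)
print(odds_for_large)  # 1000000000000, i.e. a trillion to one
```

Because the posterior odds scale linearly with the observer ratio, no realistic physical evidence can outweigh them, which is exactly the presumptuous philosopher's boast.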

SLIDE 26

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 27

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 28

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 29

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 30

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

If I say tails, she says...

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 31

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

If I say tails, she says...

...

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 32

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

If I say tails, she says...

...

Evidential Decision Theory / Causal Decision Theory

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 33

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

If I say tails, she says...

...

How much do I care about her, anyway?

Evidential Decision Theory / Causal Decision Theory

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 34

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

If I say tails, she says...

...

How much do I care about her, anyway?

Evidential Decision Theory / Causal Decision Theory
Altruistic / Selfish (precommit?)

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 35

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

If I say tails, she says...

...

How much do I care about her, anyway? Do I do this, or did we do it together?

Evidential Decision Theory / Causal Decision Theory
Altruistic / Selfish (precommit?)

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 36

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

If I say tails, she says...

...

How much do I care about her, anyway? Do I do this, or did we do it together?

Evidential Decision Theory / Causal Decision Theory
Altruistic / Selfish (precommit?)
Total responsibility / Partial responsibility

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 37

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

Evidential Decision Theory / Causal Decision Theory
Altruistic / Selfish (precommit?)
Total responsibility / Partial responsibility

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 38

Is anthropics the problem?

Psy-Kosh’s non-anthropic problem

1 decider: gain if guess heads
2 deciders: gain if both guess tails

Evidential Decision Theory / Causal Decision Theory
Altruistic / Selfish (precommit?)
Total responsibility / Partial responsibility
SIA / SSA

[Diagram: Heads and Tails branches, each with Room 1 and Room 2]

SLIDE 39

Anthropic probabilities don’t really make sense

Frequentism:

[Diagram: many repeated runs of the experiment]

SLIDE 40

Anthropic probabilities don’t really make sense

Frequentism:

How many times were you right (SIA)? vs How many experiments were you right in (SSA)?

[Diagram: many repeated runs of the experiment]
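The two frequentist counts really do diverge, and can be simulated. A sketch of my own (run counts are illustrative): it tallies how often "heads" is the truth per awakening versus per experiment.

```python
import random

# Simulate many Sleeping Beauty experiments and count both frequencies.
random.seed(0)
runs = 100_000
heads_runs = 0
heads_awakenings = 0
total_awakenings = 0

for _ in range(runs):
    heads = random.random() < 0.5
    n_wake = 1 if heads else 2      # Heads: Monday only; Tails: Monday and Tuesday
    total_awakenings += n_wake
    if heads:
        heads_runs += 1
        heads_awakenings += 1       # the single heads awakening

# "How many times were you right?" counts awakenings: ~1/3 heads.
per_awakening = heads_awakenings / total_awakenings
# "How many experiments were you right in?" counts runs: ~1/2 heads.
per_experiment = heads_runs / runs
print(round(per_awakening, 2), round(per_experiment, 2))
```

Both counts are correct answers to different questions, which is the point: frequentism alone does not pick between them.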

SLIDE 41

Anthropic probabilities don’t really make sense

Bayesianism:

SLIDE 42

Anthropic probabilities don’t really make sense

Bayesianism:

Uncertain about the world with you in it (SSA)? vs Uncertain about you in the world (SIA)?

SLIDE 43

Anthropic probabilities don’t really make sense

Subjective credences and expectations: these were forged by evolution in non-anthropic situations.

SLIDE 44

The morals of the talk

The Sleeping Beauty problem is underdefined – we need Beauty’s values. Even without anthropic probabilities, we can still reach the right decision.

SLIDE 45

Decisions and values, not probabilities

Upon each awakening, Beauty is offered a coupon at £X that pays £1 if the coin was tails.

SLIDE 46

Decisions and values, not probabilities

Upon each awakening, Beauty is offered a coupon at £X that pays £1 if the coin was tails.

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

SLIDE 47

Decisions and values, not probabilities

What would Sunday Beauty want?

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

SLIDE 48

Decisions and values, not probabilities

What would Sunday Beauty want? If all cash goes towards a “cause”: X < £2/3

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

Expected: 0.5(-X) + 0.5(1-X)·2

SLIDE 49

Decisions and values, not probabilities

What would Sunday Beauty want? If all cash goes towards a “cause”: X < £2/3

Axiom 1: Precommitments are possible.

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

Expected: 0.5(-X) + 0.5(1-X)·2

SLIDE 50

Decisions and values, not probabilities

What would Sunday Beauty want? If cash is saved: X < £2/3

Axiom 1: Precommitments are possible.

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

Expected: 0.5(-X) + 0.5(1-X)·2

SLIDE 51

Decisions and values, not probabilities

What would Sunday Beauty want? If cash buys chocolate: X < £2/3 or £1/2

Axiom 1: Precommitments are possible.

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

Expected: 0.5(-X) + 0.5(1-X)·1
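The break-even prices on these slides can be recovered numerically from the two expected-value formulas. A sketch of my own (the function names are mine, for illustration):

```python
# Break-even coupon price X under the fair coin, for two of the value
# systems on the slides.

def cause_value(x):
    # All winnings pooled toward a cause: both tails payouts count.
    return 0.5 * (-x) + 0.5 * (1 - x) * 2

def chocolate_value(x):
    # Cash buys chocolate eaten on the spot: the duplicate tails
    # purchases don't add up, so the tails payout counts once.
    return 0.5 * (-x) + 0.5 * (1 - x) * 1

def break_even(value, lo=0.0, hi=1.0):
    # Bisect for the price where expected value crosses zero
    # (expected value is decreasing in x on [0, 1]).
    for _ in range(60):
        mid = (lo + hi) / 2
        lo, hi = (mid, hi) if value(mid) > 0 else (lo, mid)
    return lo

print(round(break_even(cause_value), 4))      # 0.6667 -> buy iff X < £2/3
print(round(break_even(chocolate_value), 4))  # 0.5    -> buy iff X < £1/2
```

The two thresholds, £2/3 and £1/2, are exactly the "thirder-like" and "halfer-like" betting behaviours, derived from values alone with no anthropic probability in sight.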

SLIDE 52

Decisions and values, not probabilities

What would Sunday Beauty want? If cash buys chocolate: X < £2/3 or £1/2

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

SLIDE 53

Decisions and values, not probabilities

What would Sunday Beauty want? If cash buys chocolate: X < £2/3 or £1/2

Axiom 2: Outside details are irrelevant.

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

SLIDE 54

Decisions and values, not probabilities

What would Sunday Beauty want? If cash buys chocolate: X < £2/3 or £1/2

Axiom 2: Outside details are irrelevant.

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian

[Payoff tree: Heads: -X; Tails: 1-X and 1-X]

SLIDE 55

Decisions and values, not probabilities

What would Sunday Beauty want? If cash buys chocolate: X < £2/3 or £1/2

Axiom 2: Outside details are irrelevant.

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian

SLIDE 56

Decisions and values, not probabilities

What would Sunday Beauty want? If cash buys chocolate: X < £2/3 or £1/2 Selfish?

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian

Expected: 0.5(-X) + 0.5(1-X)·1

SLIDE 57

Decisions and values, not probabilities

What would Sunday Beauty want? If cash buys chocolate: X < £2/3 or £1/2 Selfish?

Axiom 3: Spurious inside details are irrelevant.

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian

Expected: 0.5(-X) + 0.5(1-X)·1

SLIDE 58

Decisions and values, not probabilities

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian
Selfish (?)

SLIDE 59

Decisions and values, not probabilities

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian
Selfish (?)

Expected: 0.5(-X)/2 + 0.5(1-X)·1

SLIDE 60

Decisions and values, not probabilities

SIA-ish: Non-indexical utility; Copy-altruistic total utilitarian
SSA-ish: Copy-altruistic average utilitarian
Selfish (strict???); Selfish (psychological approach)

Expected: 0.5(-X)/2 + 0.5(1-X)·1

SLIDE 61

Axioms

  • Axiom 1: Precommitments are possible.

(gives standard Sleeping Beauty for non-indexical preferences and altruists)

  • Axiom 2: Outside details are irrelevant.

(gives incubator variant of Sleeping Beauty)

  • Axiom 3: Spurious inside details are irrelevant.

(gives selfish preferences)

SLIDE 62

Linked decisions

SLIDE 63

Linked decisions

SLIDE 64

Linked decisions

SLIDE 65

Linked decisions

SLIDE 66

Linked decisions

SLIDE 67

Linked decisions

Self-confirming linking

SLIDE 68

Anthropic Decision Theory

Anthropic decision theory (ADT): An ADT agent searches for self-confirming linkings (for a given decision). It then maximises expected utility, using standard (non-anthropic) probabilities, acting as if it controlled all the agents’ linked decisions.
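As a toy illustration of this definition (my own sketch, with hypothetical payoff numbers), here is an ADT-style choice on Psy-Kosh's problem: assuming the deciders' decisions are linked because they reason identically, the agent scores whole policies with the ordinary coin probabilities, acting as if it set every linked decision at once.

```python
# ADT sketch for Psy-Kosh's problem. Assumption: all deciders run the
# same reasoning, so "guess heads" / "guess tails" is a self-confirming
# linked policy. Payoffs are illustrative, not from the talk.

def adt_choice(gain_if_heads_guessed, gain_if_tails_guessed):
    expected = {
        # Heads world: one decider; the linked policy "heads" wins there.
        "heads": 0.5 * gain_if_heads_guessed,
        # Tails world: two linked deciders; "tails" makes both guess tails.
        "tails": 0.5 * gain_if_tails_guessed,
    }
    # Maximise expected utility over whole linked policies.
    return max(expected, key=expected.get)

print(adt_choice(3, 2))  # heads
print(adt_choice(1, 2))  # tails
```

Note that no anthropic probability appears: only the 50/50 coin and the value of each complete policy.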

SLIDE 69

Adam and Eve paradox

SSA: Probability of successful hunt is high.

SLIDE 70

Adam and Eve paradox

SSA: Probability of successful hunt is high. Average utilitarian: if average happiness is the same, the disutility of a failed hunt is smaller when there are more people.

SLIDE 71

Adam and Eve paradox

SSA: Probability of successful hunt is high. Average utilitarian: if average happiness is the same, the disutility of a failed hunt is smaller when there are more people. Selfish + precommitment + ignorance: in the first world, Adam and Eve suffer, but I’m unlikely to be them. In the second world, Adam and Eve benefit, and I’m certain to be one of them.

SLIDE 72

Doomsday argument

SSA: Probability of doom is high. No future generations.

SLIDE 73

Doomsday argument

SSA: Probability of doom is high. No future generations. What kind of betting behaviour are we looking for? Prefers to consume a windfall now rather than save future generations.

SLIDE 74

Doomsday argument

SSA: Probability of doom is high. No future generations. What kind of betting behaviour are we looking for? Prefers to consume a windfall now rather than save future generations. Average utilitarian: if future generations are of similar average happiness, then better to consume windfall ω today than let Ω more people exist.

ω/Ω ≈ 0

SLIDE 75

Presumptuous philosopher

Λ=?

SIA: The probability of the large universe is large.

SLIDE 76

Presumptuous philosopher

Λ=?

SIA: The probability of the large universe is large.

Total utilitarian: in a large universe, many philosophers win their bets, and I care about them.