

  1. Lecture 3: Stability and Action Hannes Leitgeb LMU Munich October 2014

  2. My main question is: What do a perfectly rational agent’s beliefs and degrees of belief have to be like in order for them to cohere with each other? And the answer suggested in Lectures 1 and 2 was: The Humean thesis: Belief corresponds to stably high degree of belief. (Where this meant ‘stably high under certain conditionalizations’.)

  3. Plan: Explain what the Humean thesis tells us about practical rationality, and recover the Humean thesis from practical starting points.
     1. Bayesian and Categorical Decision-Making
     2. Subjective Assertability
     3. A Note on the Preface Paradox
     4. Belief and Acceptance
     Again I will focus on (inferentially) perfectly rational agents only.

  4. Bayesian and Categorical Decision-Making. In Lecture 1 we derived a compatibility result for Bayesian decision-making and categorical decision-making. The very simple all-or-nothing framework was this:
     – O is a set of outcomes.
     – u : O → R is an "all-or-nothing" utility function that takes precisely two values u_max > u_min.
     – Actions A are all the functions from W to O (very tolerant!).
     – Use(A) = { w ∈ W | u(A(w)) = u_max } (= { w ∈ W | A serves u in w }).
     – An action A is (practically) permissible given Bel and u iff Bel(Use(A)).
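The framework above can be sketched in a few lines of code. This is a minimal toy encoding, not from the lecture: the worlds, outcomes, action, and belief core B_W are all illustrative assumptions.

```python
W = ["w1", "w2", "w3"]                    # possible worlds (assumed)
u = {"good": 1.0, "bad": 0.0}             # all-or-nothing utility: u_max = 1, u_min = 0
u_max = max(u.values())

# An action is any function from worlds to outcomes, here coded as a dict.
A = {"w1": "good", "w2": "good", "w3": "bad"}

def use(action):
    """Use(A) = the set of worlds in which A attains the maximal utility."""
    return {w for w in W if u[action[w]] == u_max}

# Belief modeled extensionally via a belief core B_W: Bel(X) iff B_W ⊆ X.
B_W = {"w1", "w2"}

def bel(X):
    return B_W <= set(X)

def permissible(action):
    """A is (practically) permissible given Bel and u iff Bel(Use(A))."""
    return bel(use(action))

print(use(A))          # {'w1', 'w2'}
print(permissible(A))  # True: the agent believes that A serves u
```

Coding Bel via a core B_W already reflects the logical closure of belief assumed throughout the lecture: the believed propositions are exactly the supersets of B_W.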

  5. One can in fact recover the Humean thesis from the logical closure of belief plus decision-theoretic coherence:
     Theorem. The following two claims are equivalent:
     – Bel is closed under logic, not Bel(∅), and Bel and P are such that for all all-or-nothing utility measures u it holds that:
       (i) for all actions A, B: if Bel(Use(A)) and not Bel(Use(B)), then E_P(u(A)) > E_P(u(B));
       (ii) for all actions A: if E_P(u(A)) is maximal, then Bel(Use(A)), and for all actions B with Bel(Use(B)) it holds that E_P(u(A)) − E_P(u(B)) < (1 − r)(u_max − u_min).
     – Bel and P satisfy the Humean thesis HT^r, and not Bel(∅).
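Clauses (i) and (ii) can be checked mechanically for a single all-or-nothing utility function (the theorem, of course, quantifies over all of them). The probability measure P, the threshold r, and the belief core below are illustrative assumptions; an all-or-nothing action is fully determined by its Use-set, so we enumerate actions that way.

```python
from itertools import combinations

W = ["w1", "w2", "w3"]
P = {"w1": 0.6, "w2": 0.3, "w3": 0.1}     # assumed probabilities
u_max, u_min = 1.0, 0.0
r = 0.5
B_W = {"w1", "w2"}                         # assumed belief core: Bel(X) iff B_W ⊆ X

def bel(X):
    return B_W <= set(X)

# Enumerate all actions by their Use-sets (subsets of W).
use_sets = [set(c) for k in range(len(W) + 1) for c in combinations(W, k)]

def expected_u(use_set):
    """E_P(u(A)) = sum over worlds of P(w) * u(A(w))."""
    return sum(P[w] * (u_max if w in use_set else u_min) for w in W)

best = max(expected_u(a) for a in use_sets)

# Clause (i): if Bel(Use(A)) and not Bel(Use(B)), then E_P(u(A)) > E_P(u(B)).
clause_i = all(expected_u(a) > expected_u(b)
               for a in use_sets if bel(a)
               for b in use_sets if not bel(b))

# Clause (ii): expectation-maximal actions are believed useful, and every
# believed-useful action comes within (1 - r)(u_max - u_min) of the optimum.
clause_ii = all(bel(a) and all(best - expected_u(b) < (1 - r) * (u_max - u_min)
                               for b in use_sets if bel(b))
                for a in use_sets if expected_u(a) == best)

print(clause_i, clause_ii)  # True True for this P, B_W, and r
```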

  6. That’s for Alexandru (thanks!): There is also a Poss-variant of this all-or-nothing decision theory that is compatible with Bayesian decision theory in the following sense:
     Theorem. The following two claims are equivalent:
     – Bel is closed under logic, not Bel(∅), and Bel and P are such that for all all-or-nothing utility measures u it holds that:
       (i) for all actions A, B: if Poss(Use(A)) and not Poss(Use(B)), then E_P(u(A)) > E_P(u(B));
       (ii) for all actions A: if E_P(u(A)) is maximal, then Poss(Use(A)), and for all actions B with Poss(Use(B)) it holds that E_P(u(A)) − E_P(u(B)) < r(u_max − u_min).
     – Bel and P satisfy the Humean thesis HT^r, and not Bel(∅).

  7. Moral: Given the logical closure of belief, it is hard to avoid the Humean thesis even on practical grounds. (If one does not aim to give a representation theorem, one can also, without much loss, restrict decision contexts to a given set of action alternatives .)

  9. Subjective Assertability When is a proposition subjectively assertable? When is an indicative conditional subjectively assertable? Ramsey (1929): If two people are arguing ‘If p will q?’ and are both in doubt as to p, they are adding p hypothetically to their stock of knowledge and arguing on that basis about q ... We can say that they are fixing their degrees of belief in q given p.

  11. Adams (1965, 1966, 1975), Edgington (1995), Bennett (2003):
      – The (rational) degree of acceptability of an indicative conditional X → Y, as given by a person's probability measure P (at a time t), is P(Y | X).
      – Indicative conditionals do not themselves express propositions.
      Let us grant that this suppositionalist thesis is right. But that still leaves open assertability simpliciter:
      – When is an indicative conditional X → Y (rationally) assertable for a person (at a time t)? Under the condition that P(Y | X) = 1? P(Y | X) > r? E(u(Acc(X → Y))) > r? Or something completely different?

  12. By the functional role of belief, the obvious answers to our questions are:
      – X is subjectively assertable (given Bel) iff Bel(X).
      – The (Categorical) Ramsey Test: X → Y is subjectively assertable (given Bel) iff Bel(Y | X),
      where Bel satisfies (with P) the Humean thesis. In the following, I will support this by assumptions on subjective assertability. (I will abuse notation a bit: ⊤ = W, ⊥ = ∅, ∧ = ∩, ∨ = ∪, ...)

  13. Logical postulates on assertability for propositions:
      (Taut) Ass(⊤)
      (Cons) ¬Ass(⊥)
      (Weak 1) From Ass(X) and X ⊆ Y, infer Ass(Y).
      (And 1) From Ass(X) and Ass(Y), infer Ass(X ∧ Y).

  14. Logical postulates on assertability for indicative conditionals (viewed as pairs of propositions):
      (Ref) Ass(X → X)
      (Weak 2) From Ass(X → Y) and Y ⊆ Z, infer Ass(X → Z).
      (And 2) From Ass(X → Y) and Ass(X → Z), infer Ass(X → Y ∧ Z).

  15. The following rule is useful for indicative conditionals whose antecedents are live possibilities ("and are ... in doubt as to p"), and I will focus on them in particular:
      (Pres) From Ass(Y) and ¬Ass(¬X), infer Ass(X → Y).
      This implies: from Ass(Y), infer Ass(⊤ → Y).
      Pres guarantees that there is substantial logical interaction between factual assertability and assertability of indicatives, which seems plausible. (Some applications of Pres will have to be read in terms of 'even if' or 'still'.)

  16. From Ass(⊤ → X), infer Ass(X).
      (CM) From Ass(X → Y) and Ass(X → Z), infer Ass(X ∧ Y → Z).
      (CC) From Ass(X → Y) and Ass(X ∧ Y → Z), infer Ass(X → Z).
      (Or) From Ass(X → Z) and Ass(Y → Z), infer Ass(X ∨ Y → Z).

  17. To these closure conditions on assertability we add one bridge principle:
      (High Prob) From Ass(X → Y) (and ¬Ass(¬X), P(X) > 0), infer P(Y | X) > 1/2.
      That is: if X → Y is assertable (where X is a live possibility in terms of Ass and P), then the degree of acceptability of X → Y is greater than that of X → ¬Y. In a nutshell: assertability of indicative conditionals is probabilistically reliable.

  18. Theorem. The following two statements are equivalent:
      I. Ass and P satisfy: Taut, Cons, Weak 1, And 1, Ref, Weak 2, And 2, Pres, CM, CC, Or (all applied to antecedents X with ¬Ass(¬X)), and High Prob.
      II. There is a (uniquely determined) proposition X such that X is a non-empty P-stable proposition, and:
      – For all propositions Z: Ass(Z) if and only if X ⊆ Z (and hence B_W = X).
      – For all propositions Y such that Y ∩ X is non-empty, and all propositions Z: Ass(Y → Z) if and only if Y ∩ X ⊆ Z.
      This coincides with our representation theorem for conditional Bel (Lecture 2!): Bel(X) ≈ Ass(X), and Bel(Y | X) ≈ Ass(X → Y).
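Clause II determines the whole assertability operator from the single core X. A small sketch of that determination, with illustrative worlds and an assumed core X (chosen by hand, not computed from a P here):

```python
W = frozenset({"w1", "w2", "w3"})   # assumed worlds
X = frozenset({"w1", "w2"})         # assumed non-empty P-stable core

def ass(Z):
    """Categorical assertability per clause II: Ass(Z) iff X ⊆ Z."""
    return X <= set(Z)

def ass_cond(Y, Z):
    """Conditional assertability per clause II: Ass(Y → Z) iff Y ∩ X ⊆ Z,
    for antecedents Y with Y ∩ X non-empty (live possibilities)."""
    Y, Z = set(Y), set(Z)
    assert Y & X, "antecedent must overlap the core"
    return (Y & X) <= Z

print(ass({"w1", "w2", "w3"}))         # True: X ⊆ W
print(ass({"w1"}))                     # False: X ⊄ {w1}
print(ass_cond({"w1", "w3"}, {"w1"}))  # True: {w1, w3} ∩ X = {w1} ⊆ {w1}
```

Note how the two clauses mirror the correspondence at the end of the slide: `ass` plays the role of Bel, and `ass_cond(Y, Z)` the role of Bel(Z | Y).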

  20. A simple example: Lord Russell has been murdered. There are three suspects: the butler, the cook and the gardener. The gardener does not seem a likely candidate, since he was seen pruning the roses at the time of the murder. The cook could easily have done it, but she had no apparent motive. But the butler was known to have recently discovered that his lordship had been taking liberties with the butler's wife. Moreover, he had had every opportunity to do the deed. So it was probably the butler, but if it wasn't the butler, then it was most likely the cook. (Bradley 2006)
      P(g ∧ ¬c ∧ ¬b) = 0.1, P(¬g ∧ c ∧ ¬b) = 0.3, P(¬g ∧ ¬c ∧ b) = 0.6.
      The candidates for B_W that satisfy all of our postulates (given P):
      – {¬g ∧ ¬c ∧ b, ¬g ∧ c ∧ ¬b, g ∧ ¬c ∧ ¬b}
      – {¬g ∧ ¬c ∧ b, ¬g ∧ c ∧ ¬b}
      – {¬g ∧ ¬c ∧ b}
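These candidates can be recomputed by brute force. The sketch below labels the three mutually exclusive worlds "butler", "cook", "gardener", and uses a convenient finite-space test for P-stability (every world in X must be more probable than X's complement); this test, rather than the official conditional-probability definition, is my shortcut, valid here since all worlds have positive probability.

```python
from itertools import combinations

# The three exclusive possibilities from the example, with their P-values.
P = {"butler": 0.6, "cook": 0.3, "gardener": 0.1}
worlds = list(P)

def prob(X):
    return sum(P[w] for w in X)

def p_stable(X):
    """Finite-space test: X is P-stable iff P({w}) > P(W \\ X) for all w in X."""
    comp = prob(set(worlds) - set(X))
    return all(P[w] > comp for w in X)

# Enumerate all non-empty subsets of the worlds and keep the P-stable ones.
candidates = [set(c) for k in range(1, len(worlds) + 1)
              for c in combinations(worlds, k) if p_stable(set(c))]
for X in candidates:
    print(sorted(X))
# Exactly the three candidates from the slide:
# ['butler'], ['butler', 'cook'], ['butler', 'cook', 'gardener']
```

For instance, {cook} fails because 0.3 is not greater than the complement's 0.7, while {butler, cook} succeeds because both 0.6 and 0.3 exceed the remaining 0.1.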
