Decision problems — September 4, 2019


Beliefs and probabilities — Classification in terms of conditional probabilities

Figure: The effect of changing variance and prior when we assume a normal distribution (panels: unequal variance, unequal prior).

Example 3 (Normal distribution). A simple example is when x_t is normally distributed in a manner that depends on the class. Figure 2 shows the distribution of x_t for two different classes, with means of −1 and +1 respectively, in three different cases. In the first case, both classes have variance 1, and we assume the same prior probability for both:

x_t | y_t = 0 ∼ N(−1, 1),    x_t | y_t = 1 ∼ N(1, 1).

But how can we get a probability model in the first place?

Beliefs and probabilities — Subjective probability

Subjective probability measure ξ: if we think event A is more likely than B, then ξ(A) > ξ(B). The usual rules of probability apply:
1. ξ(A) ∈ [0, 1].
2. ξ(∅) = 0.
3. If A ∩ B = ∅, then ξ(A ∪ B) = ξ(A) + ξ(B).

Beliefs and probabilities — Bayesian inference illustration

Use a subjective belief ξ(µ) on M as the prior: the prior belief ξ(µ) represents our initial uncertainty. We observe history h (the evidence); each possible µ assigns a probability P_µ(h) to h. We can use this to update our belief via Bayes' theorem to obtain the posterior belief:

ξ(µ | h) ∝ P_µ(h) ξ(µ)    (conclusion = evidence × prior)

Beliefs and probabilities — Probability and Bayesian inference: some examples

Example 4. John claims to be a medium. He throws a coin n times and always predicts its value correctly. Should we believe that he is a medium?
- µ1: John is a medium.
- µ0: John is not a medium.
The answer depends on what we expect a medium to be able to do, and how likely we thought he'd be a medium in the first place.

Beliefs and probabilities — Probability and Bayesian inference

Bayesian inference
- Mutually exclusive models M = {µ_1, ..., µ_k}.
- Probability model for any data x: P_µ(x) ≡ P(x | µ).
- For each model, we have a prior probability ξ(µ) that it is correct.

Posterior probability:

ξ(µ | x) = P(x | µ) ξ(µ) / Σ_{µ′ ∈ M} P(x | µ′) ξ(µ′) = P_µ(x) ξ(µ) / Σ_{µ′ ∈ M} P_µ′(x) ξ(µ′).

Interpretation
- M: the set of all possible models that could describe the data.
- P_µ(x): the probability of x under model µ; in alternative notation, P(x | µ) is the probability of x given that model µ is correct.
- ξ(µ): our belief, before seeing the data, that µ is correct.
- ξ(µ | x): our belief, after seeing the data, that µ is correct.

Beliefs and probabilities — Probability and Bayesian inference

Exercise 1 (continued example for the medium)

P_µ(x) = ∏_{t=1}^n P_µ(x_t)    (independence property)
P_µ1(x_t = 1) = 1, P_µ1(x_t = 0) = 0    (true medium model)
P_µ0(x_t = 1) = 1/2, P_µ0(x_t = 0) = 1/2    (non-medium model)
ξ(µ0) = 1/2, ξ(µ1) = 1/2    (prior belief)
ξ(µ1 | x) = P_µ1(x) ξ(µ1) / P_ξ(x)    (posterior belief)
P_ξ(x) ≜ P_µ1(x) ξ(µ1) + P_µ0(x) ξ(µ0)    (marginal distribution)

Throw a coin 4 times, and have a classmate make a prediction. What is your belief that your classmate is a medium? Is the prior you used reasonable?
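A minimal sketch of this calculation in Python (the function name and defaults are ours; it assumes the classmate predicts all n throws correctly, since a single error gives the medium model zero likelihood):

```python
def medium_posterior(n_correct, prior_medium=0.5):
    """Posterior belief in mu_1 (medium) after n_correct correct
    predictions and no errors."""
    p_x_medium = 1.0                # a true medium is always correct
    p_x_chance = 0.5 ** n_correct   # chance predictions: 1/2 per throw
    marginal = p_x_medium * prior_medium + p_x_chance * (1 - prior_medium)
    return p_x_medium * prior_medium / marginal

print(medium_posterior(4))  # 16/17 ≈ 0.94
```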

Beliefs and probabilities — Probability and Bayesian inference

Sequential update of beliefs

         M    T    W    T    F    S    S
CNN     0.5  0.6  0.7  0.9  0.5  0.3  0.1
SMHI    0.3  0.7  0.8  0.9  0.5  0.2  0.1
YR      0.6  0.9  0.8  0.5  0.4  0.1  0.1
Rain?    Y    Y    Y    N    Y    N    N

Table: Predictions by three different entities for the probability of rain on a particular day, along with whether or not it actually rained.

Exercise 2
- n meteorological stations {µ_i | i = 1, ..., n}.
- The i-th station predicts rain with probability P_µi(x_t | x_1, ..., x_{t−1}).
- Let ξ_t(µ) be our belief at time t. Derive the next-step belief ξ_{t+1}(µ) ≜ ξ_t(µ | x_t) in terms of the current belief ξ_t, and write a Python function that computes this posterior (a sketch follows below):

ξ_{t+1}(µ) ≜ ξ_t(µ | x_t) = P_µ(x_t | x_1, ..., x_{t−1}) ξ_t(µ) / Σ_{µ′} P_µ′(x_t | x_1, ..., x_{t−1}) ξ_t(µ′).
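One possible implementation of the update the exercise asks for (a sketch; the function and variable names are ours):

```python
import numpy as np

def update_belief(belief, pred_rain, rained):
    """One step of the sequential Bayesian update.

    belief    -- prior probabilities xi_t(mu), one entry per station
    pred_rain -- each station's predicted P(rain) for today
    rained    -- True if it actually rained today
    Returns the posterior xi_{t+1}(mu)."""
    likelihood = pred_rain if rained else 1.0 - pred_rain
    posterior = likelihood * belief
    return posterior / posterior.sum()

# Run it on the table above (rows: CNN, SMHI, YR).
preds = np.array([[0.5, 0.6, 0.7, 0.9, 0.5, 0.3, 0.1],
                  [0.3, 0.7, 0.8, 0.9, 0.5, 0.2, 0.1],
                  [0.6, 0.9, 0.8, 0.5, 0.4, 0.1, 0.1]])
rained = [True, True, True, False, True, False, False]
belief = np.full(3, 1 / 3)
for t, r in enumerate(rained):
    belief = update_belief(belief, preds[:, t], r)
    print(t, belief.round(3))
```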

Beliefs and probabilities — Probability and Bayesian inference

Bayesian inference for Bernoulli distributions

Estimating a coin's bias: a fair coin comes up heads 50% of the time. We want to test an unknown coin, which we think may not be completely fair.

Figure: Prior belief ξ about the coin bias θ.

For a sequence of throws x_t ∈ {0, 1},

P_θ(x) ∝ ∏_t θ^{x_t} (1 − θ)^{1 − x_t} = θ^{#Heads} (1 − θ)^{#Tails}.

Say we throw the coin 100 times and obtain 70 heads. Then we can plot the likelihood P_θ(x) of the different models.

Figure: Prior belief ξ(θ) about the coin bias θ, likelihood of θ for the data, and posterior belief ξ(θ | x).

From these, we calculate a posterior distribution over the candidate models. This represents our conclusion given our prior and the data.
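This is the conjugate Beta–Bernoulli model: with a Beta prior, the posterior after observing heads and tails is again a Beta distribution. A minimal sketch (the hyperparameters a0 = b0 = 2 are an assumption, chosen to resemble the prior in the figure):

```python
from scipy.stats import beta

a0, b0 = 2, 2          # assumed Beta prior hyperparameters
heads, tails = 70, 30  # the data from the slide

# Conjugacy: the posterior is Beta(a0 + #Heads, b0 + #Tails).
posterior = beta(a0 + heads, b0 + tails)
print(posterior.mean())            # ≈ 0.69
print(posterior.interval(0.95))    # central 95% credible interval
```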

Beliefs and probabilities — Learning outcomes

Understanding
- The axioms of probability, marginals and conditional distributions.
- The philosophical underpinnings of Bayesianism.
- The simple conjugate model for Bernoulli distributions.

Skills
- Be able to calculate with probabilities using the marginal and conditional definitions and Bayes' rule.
- Be able to implement a simple Bayesian inference algorithm in Python.

Reflection
- How useful is the Bayesian representation of uncertainty?
- How restrictive is the need to select a prior distribution?
- Can you think of another way to explicitly represent uncertainty in a way that can incorporate new evidence?

Hierarchies of decision making problems

1. Beliefs and probabilities
2. Hierarchies of decision making problems
3. Simple decision problems (Decision rules; Formalising Classification problems)
4. Classification with stochastic gradient descent

Hierarchies of decision making problems — Simple decision problems: preferences

Example 5
- Food: A. McDonald's cheeseburger; B. Surströmming; C. Oatmeal.
- Money: A. 10,000,000 SEK; B. 10,000,000 USD; C. 10,000,000 BTC.
- Entertainment: A. Ticket to Liseberg; B. Ticket to Rebstar; C. Ticket to Nutcracker.

Hierarchies of decision making problems — Simple decision problems: rewards and utilities

- Each choice is called a reward r ∈ R.
- There is a utility function U : R → ℝ, assigning values to rewards.
- We (weakly) prefer A to B iff U(A) ≥ U(B).

Exercise 3
From your individual preferences, derive a common utility function that reflects everybody's preferences in the class for each of the three examples. Is there a simple algorithm for deciding this? Would you consider the outcome fair?

Hierarchies of decision making problems — Simple decision problems: preferences among random outcomes

Example 6. Would you rather...
A. Have 100 EUR now?
B. Flip a coin, and get 200 EUR if it comes up heads?

The expected utility hypothesis
Rational decision makers prefer choice A to B if E(U | A) ≥ E(U | B), where the expected utility is

E(U | A) = Σ_r U(r) P(r | A).

In the above example, r ∈ {0, 100, 200}, U(r) is increasing, and the coin is fair.

Risk and monetary rewards
- If U is convex, we are risk-seeking.
- If U is linear, we are risk-neutral.
- If U is concave, we are risk-averse.
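A small sketch making the hypothesis concrete for Example 6, with three illustrative utility functions (our choices) showing how curvature determines the preference:

```python
import math

def expected_utility(outcomes, utility):
    """outcomes: list of (reward, probability) pairs."""
    return sum(utility(r) * p for r, p in outcomes)

A = [(100, 1.0)]            # 100 EUR for sure
B = [(200, 0.5), (0, 0.5)]  # fair coin flip for 200 EUR

for name, u in [("risk-neutral (linear)", lambda r: r),
                ("risk-averse (concave)", lambda r: math.sqrt(r)),
                ("risk-seeking (convex)", lambda r: r ** 2)]:
    print(name, expected_utility(A, u), expected_utility(B, u))
# linear: indifferent; concave: prefers A; convex: prefers B
```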

Hierarchies of decision making problems — Simple decision problems: uncertain rewards

- Decisions a ∈ A.
- Each choice is called a reward r ∈ R.
- There is a utility function U : R → ℝ, assigning values to rewards.
- We (weakly) prefer A to B iff U(A) ≥ U(B).

Example 7. You are going to work, and it might rain. What do you do?
- a1: Take the umbrella. a2: Risk it!
- ω1: rain. ω2: dry.

ρ(ω, a)   a1                       a2
ω1        dry, carrying umbrella   wet
ω2        dry, carrying umbrella   dry

U[ρ(ω, a)]   a1    a2
ω1            0   −10
ω2            0     1

Table: Rewards and utilities.

max_a min_ω U = 0,    min_ω max_a U = 0.

Hierarchies of decision making problems — Simple decision problems: expected utility

E(U | a) = Σ_ω U[ρ(ω, a)] P(ω | a)

Example 8. You are going to work, and it might rain. The forecast said that the probability of rain (ω1) was 20%. What do you do? a1: Take the umbrella. a2: Risk it!

ρ(ω, a)      a1                       a2
ω1           dry, carrying umbrella   wet
ω2           dry, carrying umbrella   dry

U[ρ(ω, a)]   a1    a2
ω1            0   −10
ω2            0     1
E_P(U | a)    0   −1.2

Table: Rewards, utilities, and expected utility for a 20% probability of rain.
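The −1.2 entry is just the probability-weighted utility of a2; a short check (a sketch, with our own variable names):

```python
# Expected utility of each action for P(rain) = 0.2 (Example 8).
U = {("rain", "umbrella"): 0, ("rain", "risk"): -10,
     ("dry",  "umbrella"): 0, ("dry",  "risk"):  1}
P = {"rain": 0.2, "dry": 0.8}

for a in ("umbrella", "risk"):
    print(a, sum(P[w] * U[(w, a)] for w in P))
# umbrella: 0.0, risk: 0.2*(-10) + 0.8*1 = -1.2
```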

Hierarchies of decision making problems — Decision rules: Bayes decision rules

Consider the case where outcomes are independent of decisions:

U(ξ, a) ≜ Σ_µ U(µ, a) ξ(µ).

This corresponds e.g. to the case where ξ(µ) is the belief about an unknown world.

Definition 9 (Bayes utility)
The maximising decision for ξ has an expected utility equal to

U*(ξ) ≜ max_{a ∈ A} U(ξ, a).    (2.1)

Hierarchies of decision making problems — Decision rules: the n-meteorologists problem

Exercise 4
- Meteorological models M = {µ_1, ..., µ_n}.
- Rain predictions at time t: p_{t,µ} ≜ P_µ(x_t = rain).
- Prior probability ξ(µ) = 1/n for each model.
- Should we take the umbrella?

         M    T    W    T    F    S    S
CNN     0.5  0.6  0.7  0.9  0.5  0.3  0.1
SMHI    0.3  0.7  0.8  0.9  0.5  0.2  0.1
YR      0.6  0.9  0.8  0.5  0.4  0.1  0.1
Rain?    Y    Y    Y    N    Y    N    N

Table: Predictions by three different entities for the probability of rain on a particular day, along with whether or not it actually rained.

1. What is your belief about the quality of each meteorologist after each day?
2. What is your belief about the probability of rain each day?

P_ξ(x_t = rain | x_1, ..., x_{t−1}) = Σ_{µ ∈ M} P_µ(x_t = rain | x_1, ..., x_{t−1}) ξ(µ | x_1, ..., x_{t−1})

3. Assume you can decide whether or not to go running each day. If you go running and it does not rain, your utility is 1. If it rains, it is −10. If you don't go running, your utility is 0. What is the decision maximising utility in expectation (with respect to the posterior) each day? (See the sketch below.)
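A self-contained sketch of one answer to all three questions, combining the sequential update from Exercise 2 with expected-utility maximisation (variable names and output format are ours):

```python
import numpy as np

preds = np.array([[0.5, 0.6, 0.7, 0.9, 0.5, 0.3, 0.1],   # CNN
                  [0.3, 0.7, 0.8, 0.9, 0.5, 0.2, 0.1],   # SMHI
                  [0.6, 0.9, 0.8, 0.5, 0.4, 0.1, 0.1]])  # YR
rained = [True, True, True, False, True, False, False]

belief = np.full(3, 1 / 3)                  # uniform prior over stations
for t, r in enumerate(rained):
    p_rain = float(belief @ preds[:, t])    # posterior-marginal P(rain)
    eu_run = p_rain * (-10) + (1 - p_rain)  # E[U | run]; E[U | stay] = 0
    print(t, belief.round(2), round(p_rain, 2),
          "run" if eu_run > 0 else "stay in")
    lik = preds[:, t] if r else 1 - preds[:, t]
    belief = lik * belief / (lik @ belief)  # update after observing the day
```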

Formalising Classification problems — Deciding a class given a model

- Features x_t ∈ X. Labels y_t ∈ Y. Decisions a_t ∈ A.
- A decision rule π(a_t | x_t) assigns probabilities to actions.
- Standard classification problem: A = Y, U(a, y) = I{a = y}.

Exercise 5. If we have a model P_µ(y_t | x_t) and a suitable U, what is the optimal decision to make?

a_t ∈ arg max_{a ∈ A} Σ_y P_µ(y_t = y | x_t) U(a, y)

For standard classification,

a_t ∈ arg max_{a ∈ A} P_µ(y_t = a | x_t).
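A direct translation of this rule (a sketch; the asymmetric utility matrix in the second call is an illustrative assumption):

```python
import numpy as np

def bayes_optimal_action(p_y_given_x, U):
    """p_y_given_x: P(y | x) for each class; U[a, y]: utility matrix.
    Returns the action maximising expected utility."""
    return int(np.argmax(U @ p_y_given_x))

p = np.array([0.2, 0.5, 0.3])
# Standard classification (U = identity) reduces to argmax P(y | x):
print(bayes_optimal_action(p, np.eye(3)))         # -> 1
# An asymmetric utility can change the decision:
U = np.array([[1, 0, 0], [0, 1, -5], [0, 0, 1]])  # a=1 is costly if y=2
print(bayes_optimal_action(p, U))                 # -> 2
```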

Formalising Classification problems — Deciding the class given a model family

- Training data D_T = {(x_i, y_i) | i = 1, ..., T}.
- Models {P_µ | µ ∈ M}. Prior ξ on M.

Posterior over classification models:

ξ(µ | D_T) = P_µ(y_1, ..., y_T | x_1, ..., x_T) ξ(µ) / Σ_{µ′ ∈ M} P_µ′(y_1, ..., y_T | x_1, ..., x_T) ξ(µ′).

If not dealing with time-series data, we assume independence between the x_t:

P_µ(y_1, ..., y_T | x_1, ..., x_T) = ∏_{i=1}^T P_µ(y_i | x_i).

The Bayes rule for maximising E_ξ(U | a, x_t, D_T)
The decision rule simply chooses the action

a_t ∈ arg max_{a ∈ A} Σ_y Σ_{µ ∈ M} P_µ(y_t = y | x_t) ξ(µ | D_T) U(a, y)    (3.1)
    = arg max_{a ∈ A} Σ_y P_{ξ | D_T}(y_t = y | x_t) U(a, y),    (3.2)

where we rewrite (3.1) by calculating the posterior marginal label probability

P_{ξ | D_T}(y_t | x_t) ≜ P_ξ(y_t | x_t, D_T) = Σ_{µ ∈ M} P_µ(y_t | x_t) ξ(µ | D_T).

Formalising Classification problems — Approximating the model

Full Bayesian approach for infinite M
Here ξ can be a probability density function, and

ξ(µ | D_T) = P_µ(D_T) ξ(µ) / P_ξ(D_T),    P_ξ(D_T) = ∫_M P_µ(D_T) ξ(µ) dµ,

which can be hard to calculate.

Maximum a posteriori model
We instead choose a single model through the following optimisation:

µ_MAP(ξ, D_T) = arg max_{µ ∈ M} P_µ(D_T) ξ(µ) = arg max_{µ ∈ M} [ ln P_µ(D_T) + ln ξ(µ) ],

where ln P_µ(D_T) measures goodness of fit and ln ξ(µ) acts as a regulariser.
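For the coin example this is a one-liner: with a Beta prior, the MAP estimate is the mode of the Beta posterior. A sketch (the hyperparameters are our assumption, matching the earlier conjugate example):

```python
# MAP estimate of the coin bias with a Beta(a0, b0) prior:
# the mode of the posterior Beta(a0 + heads, b0 + tails).
a0, b0 = 2, 2
heads, tails = 70, 30
theta_map = (a0 + heads - 1) / (a0 + b0 + heads + tails - 2)
print(theta_map)  # 71/102 ≈ 0.696
```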

Formalising Classification problems — Learning outcomes

Understanding
- Preferences, utilities and the expected utility principle.
- Hypothesis testing and classification as decision problems.
- How to interpret p-values and Bayesian tests.
- The MAP approximation to full Bayesian inference.

Skills
- Be able to implement an optimal decision rule for a given utility and probability.
- Be able to construct a simple null hypothesis test.

Reflection
- When would expected utility maximisation not be a good idea?
- What does a p-value represent when you see it in a paper?
- Can we prevent high false discovery rates when using p-values?
- When is the MAP approximation good?

Formalising Classification problems — Statistical testing: simple hypothesis testing

The simple hypothesis test as a decision problem
- M = {µ0, µ1}
- a0: accept model µ0; a1: accept model µ1.

U     µ0   µ1
a0     1    0
a1     0    1

Table: Example utility function for simple hypothesis tests.

Example 10 (continuation of the medium example)
µ1: John is a medium. µ0: John is not a medium.

E_ξ(U | a0) = 1 × ξ(µ0 | x) + 0 × ξ(µ1 | x),
E_ξ(U | a1) = 0 × ξ(µ0 | x) + 1 × ξ(µ1 | x).

Formalising Classification problems — Statistical testing: null hypothesis test

Many times there is only one model under consideration, µ0, the so-called null hypothesis.

The null hypothesis test as a decision problem
- a0: Accept model µ0.
- a1: Reject model µ0.

Example 11 (construction of the test for the medium)
- µ0 is simply the Bernoulli(1/2) model: responses are by chance.
- We need to design a policy π(a | x) that accepts or rejects depending on the data.
- Since there is no alternative model, we can only construct this policy according to its properties when µ0 is true.
- In particular, we can fix a policy that only chooses a1, when µ0 is true, a proportion δ of the time.
- This can be done by constructing a threshold test from the inverse CDF.

Formalising Classification problems — Statistical testing: using p-values to construct statistical tests

Definition 12 (null statistical test)
The statistic f : X → [0, 1] is designed to have the property

P_µ0({x | f(x) ≤ δ}) = δ.

If our decision rule is

π(a | x) = a1 if f(x) ≤ δ, and a0 if f(x) > δ,

then the probability of rejecting the null hypothesis when it is true is exactly δ. The value of the statistic f(x), otherwise known as the p-value, is uninformative.

Formalising Classification problems — Statistical testing: issues with p-values

- They only measure quality of fit on the data.
- They are not robust to model misspecification.
- They ignore effect sizes.
- They do not consider prior information.
- They do not represent the probability of having made an error.
- The null-rejection error probability is the same irrespective of the amount of data (by design).

Formalising Classification problems — Statistical testing: p-values for the medium example

- µ0 is simply the Bernoulli(1/2) model: responses are by chance.
- CDF: P_µ0(N ≤ n | K = 100), the probability of at most n successes in 100 throws.
- ICDF: the number of successes that will happen with probability at least δ; e.g. we'll get at most 50 successes a proportion δ = 1/2 of the time.
- Using the (inverse) CDF we can construct a policy π that selects a1, when µ0 is true, only a δ portion of the time, for any choice of δ.

Figure: The CDF (probability of at most N successes) and the ICDF under µ0 for K = 100 throws.

Formalising Classification problems — Statistical testing: building a test

The test statistic
We want the test to reflect that we don't have a significant number of failures:

f(x) = 1 − binocdf(Σ_{t=1}^n x_t, n, 0.5).

What f(x) is and is not
- It is a statistic which is ≤ δ a δ portion of the time when µ0 is true.
- It is not the probability of observing x under µ0.
- It is not the probability of µ0 given x.
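In Python, binocdf corresponds to scipy.stats.binom.cdf; a sketch of the statistic and its use as a test (the function name is ours):

```python
from scipy.stats import binom

def medium_p_value(successes, n):
    """f(x) = 1 - binocdf(sum_t x_t, n, 0.5): the probability, under
    mu_0 (pure chance), of strictly more successes than observed."""
    return 1.0 - binom.cdf(successes, n, 0.5)

# 70 correct predictions out of 100 throws:
p = medium_p_value(70, 100)
print(p, p <= 0.05)   # ~2e-05, True: reject mu_0 at delta = 0.05
```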

Formalising Classification problems — Statistical testing

Exercise 6
- Let us throw a coin 8 times, and try to predict the outcome.
- Select a p-value threshold so that δ = 0.05. For 8 throws, this corresponds to > 6 successes, i.e. a success rate of ≥ 87.5%.
- Let's calculate the p-value for each one of you.
- What is the rejection performance of the test?

Figure: How the rejection threshold, in terms of the success rate, changes with the number of throws to achieve an error rate of δ = 0.05.

Figure: The rejection rate of the null hypothesis (µ0) for two cases: first, when µ0 is true; second, when the data is generated from Bernoulli(0.55).
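A small simulation estimating both rejection rates (a sketch; the seed and trial count are arbitrary choices):

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(0)
n, delta, trials = 8, 0.05, 100_000
# Smallest k with P(N >= k) <= delta under Bin(n, 1/2): here k = 7.
threshold = int(binom.ppf(1 - delta, n, 0.5)) + 1

for p in (0.5, 0.55):
    successes = rng.binomial(n, p, size=trials)
    print(p, np.mean(successes >= threshold))
# ~0.035 when mu_0 is true; only ~0.06 against Bernoulli(0.55): low power.
```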

Formalising Classification problems — Statistical testing: statistical power and false discovery

Beyond not rejecting the null when it's true, we also want:
- High power: rejecting the null when it is false.
- A low false discovery rate: few of our rejections should be cases where the null is actually true.

Power: the power depends on what hypothesis we use as an alternative.
False discovery rate: false discovery depends on how likely it is a priori that the null is false.

Formalising Classification problems — Statistical testing: the Bayesian version of the test

Example 13
1. Set U(a_i, µ_j) = I{i = j}.
2. Set ξ(µ_i) = 1/2.
3. µ0: Bernoulli(1/2).
4. µ1: Bernoulli(θ), θ ∼ Unif([0, 1]).
5. Calculate ξ(µ | x).
6. Choose a_i, where i = arg max_j ξ(µ_j | x).

Bayesian model averaging for the alternative model µ1:

P_µ1(x) = ∫_Θ B_θ(x) dβ(θ)    (3.3)
ξ(µ0 | x) = P_µ0(x) ξ(µ0) / [P_µ0(x) ξ(µ0) + P_µ1(x) ξ(µ1)]    (3.4)
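Under the uniform prior on θ, the integral (3.3) has a closed form: for h successes out of n throws it equals the Beta function B(h + 1, n − h + 1) = 1 / ((n + 1) C(n, h)). A sketch of the whole test (the function name is ours):

```python
from math import comb

def posterior_null(successes, n, prior_null=0.5):
    """xi(mu_0 | x) for mu_0 = Bernoulli(1/2) vs mu_1 with theta ~ Unif[0,1].
    P_mu1(x) = B(h+1, n-h+1) = 1 / ((n+1) * C(n, h))."""
    p_x_null = 0.5 ** n
    p_x_alt = 1.0 / ((n + 1) * comb(n, successes))
    return (p_x_null * prior_null
            / (p_x_null * prior_null + p_x_alt * (1 - prior_null)))

print(posterior_null(70, 100))  # ≈ 0.002: choose a_1, i.e. reject mu_0
```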

Formalising Classification problems — Statistical testing

Figure: Posterior probability of the null hypothesis — convergence of the posterior probability for null-distributed and other-distributed data.

Figure: Rejection of the null hypothesis for Bernoulli(0.5) — comparison of the rejection probability for the null and the Bayesian test when µ0 is true.

Figure: Rejection of the null hypothesis for Bernoulli(0.55) — comparison of the rejection probability for the null and the Bayesian test when µ1 is true.

Formalising Classification problems — Statistical testing: further reading

Points of significance (Nature Methods):
- Importance of being uncertain: https://www.nature.com/articles/nmeth.2613
- Error bars: https://www.nature.com/articles/nmeth.2659
- P values and the search for significance: https://www.nature.com/articles/nmeth.4120
- Bayes' theorem: https://www.nature.com/articles/nmeth.3335
- Sampling distributions and the bootstrap: https://www.nature.com/articles/nmeth.3414

Classification with stochastic gradient descent

1. Beliefs and probabilities
2. Hierarchies of decision making problems
3. Formalising Classification problems
4. Classification with stochastic gradient descent (Neural network models)

Classification with stochastic gradient descent — Classification as an optimisation problem

The µ-optimal classifier:

max_{θ ∈ Θ} f(π_θ, µ, U),    f(π_θ, µ, U) ≜ E^{π_θ}_µ(U)    (4.1)
f(π_θ, µ, U) = Σ_{x,y,a} U(a, y) π_θ(a | x) P_µ(y | x) P_µ(x)    (4.2)
             ≈ (1/T) Σ_{t=1}^T Σ_{a_t} U(a_t, y_t) π_θ(a_t | x_t),    (x_t, y_t)_{t=1}^T ∼ P_µ.    (4.3)
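A minimal sketch of optimising the empirical objective (4.3) by stochastic gradient ascent, using a linear softmax policy π_θ(a | x) and U(a, y) = I{a = y}; the synthetic data generator and all names are our assumptions:

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

d, n_classes, lr = 2, 2, 0.1
theta = np.zeros((n_classes, d))
for _ in range(1000):
    y = rng.integers(n_classes)            # sample a label
    x = rng.normal(loc=2 * y - 1, size=d)  # class-dependent features
    p = softmax(theta @ x)                 # pi_theta(a | x)
    # For U(a, y) = I{a=y}: sum_a U(a, y) pi(a|x) = pi(y|x), and
    # d pi(y|x) / d theta_a = pi(y|x) * (I{a=y} - pi(a|x)) * x.
    grad = p[y] * (np.eye(n_classes)[y] - p)[:, None] * x[None, :]
    theta += lr * grad                     # one stochastic ascent step
print(theta)
```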

Classification with stochastic gradient descent — Bayesian inference for Bernoulli distributions

Estimating a coin's bias: a fair coin comes up heads 50% of the time. We want to test an unknown coin, which we think may not be completely fair.

Figure: Prior belief ξ about the coin bias θ.

For a sequence of throws x_t ∈ {0, 1},

P_θ(x) ∝ ∏_t θ^{x_t} (1 − θ)^{1 − x_t} = θ^{#Heads} (1 − θ)^{#Tails}.

Say we throw the coin 100 times and obtain 70 heads. Then we plot the likelihood P_θ(x) of different models.

Figure: Prior belief ξ about the coin bias θ and likelihood of θ for the data.
