
Statistical Machine Learning Lecture 05: Bayesian Decision Theory



  1. Statistical Machine Learning, Lecture 05: Bayesian Decision Theory. Kristian Kersting, TU Darmstadt, Summer Term 2020. Based on slides from J. Peters.

  2. Today's Objectives: make you understand how to make an optimal decision! Covered topics: Bayesian optimal decisions; classification from a Bayesian point of view; risk-based classification.

  3. Outline: 1. Bayesian Decision Theory, 2. Risk Minimization, 3. Wrap-Up.

  4. Section 1: Bayesian Decision Theory.

  5. Statistical Methods. Statistical methods in machine learning all have in common that they assume the process that "generates" the data is governed by the rules of probability. The data is understood to be a set of random samples from some underlying probability distribution. Today will be all about probabilities, but in future lectures the use of probability will sometimes be much less explicit. Nonetheless, the basic assumption about how the data is generated is always there, even if you don't see a single probability distribution anywhere.

  6. Character Recognition. Goal: classify a new letter so that the probability of a wrong classification is minimized.

  7. Class-Conditional Probabilities. The class-conditional probability p(x | C_k) is the probability of making an observation x knowing that it comes from class C_k. Here x is often a feature vector that measures/describes properties of the data, e.g. the number of black pixels, the height-width ratio, ...
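
To make the feature vector idea concrete, here is a minimal sketch that computes the slide's two example features from a letter stored as a binary image. The image, the function name, and the tiny example array are all illustrative assumptions, not from the lecture:

```python
import numpy as np

def extract_features(img):
    """Map a binary letter image (1 = black pixel) to a feature vector x."""
    n_black = img.sum()                   # feature 1: number of black pixels
    rows = np.where(img.any(axis=1))[0]   # rows containing black pixels
    cols = np.where(img.any(axis=0))[0]   # columns containing black pixels
    height = rows.max() - rows.min() + 1
    width = cols.max() - cols.min() + 1
    return np.array([n_black, height / width])  # feature 2: height-width ratio

# A crude 5x3 "letter" just for illustration
img = np.array([[1, 1, 1],
                [1, 0, 1],
                [1, 1, 1],
                [1, 0, 1],
                [1, 0, 1]])
print(extract_features(img))  # [12.  1.66666667]
```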

  8. Class-Conditional Probabilities: Example. How do we decide which class the data point belongs to? Here, we should decide for class a.

  9. Class-Conditional Probabilities: Example. How do we decide which class the data point belongs to? Since p(x | a) is a lot smaller than p(x | b), we should now decide for class b.

  10. Class-Conditional Probabilities: Example. How do we decide which class the data point belongs to?

  11. Class Priors. The a priori probability of a data point belonging to a particular class is called the class prior. Example: for the sequence abaaababaaaabbaaaaaa, what are p(a) and p(b)? With C_1 = a and C_2 = b, counting gives p(C_1) = 0.75 and p(C_2) = 0.25; as required, Σ_k p(C_k) = 1.
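
A minimal sketch of estimating these priors by counting, using the example sequence from the slide (variable names are mine):

```python
from collections import Counter

seq = "abaaababaaaabbaaaaaa"
counts = Counter(seq)                              # 15 a's, 5 b's
priors = {c: n / len(seq) for c, n in counts.items()}
print(priors)  # {'a': 0.75, 'b': 0.25}; the priors sum to 1
```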

  12. Back to Our Problem... Example: how do we decide which class the data point belongs to? If p(a) = 0.75 and p(b) = 0.25, we should decide for class a.

  13. Bayesian Decision Theory. Bayes' theorem lets us formalize the previous intuitive decision. We want to find the a-posteriori probability (posterior) of the class C_k given the observation (feature) x:

p(C_k | x) = p(x | C_k) p(C_k) / p(x) = p(x | C_k) p(C_k) / Σ_j p(x | C_j) p(C_j)

Terminology: class prior p(C_k); class-conditional probability (likelihood) p(x | C_k); class posterior p(C_k | x); normalization term p(x).
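
As a sketch of how this posterior is computed in practice (the likelihood values and priors below are made-up numbers, purely for illustration):

```python
def posterior(likelihoods, priors):
    """p(C_k | x) for all k, from class-conditionals p(x | C_k) and priors p(C_k).

    likelihoods[k] = p(x | C_k) evaluated at the observed x.
    The normalizer p(x) = sum_j p(x | C_j) p(C_j) makes the result sum to 1.
    """
    joint = [l * p for l, p in zip(likelihoods, priors)]
    evidence = sum(joint)  # p(x)
    return [j / evidence for j in joint]

# Hypothetical values: p(x|a) = 0.2, p(x|b) = 0.4, priors 0.75 / 0.25.
# The prior outweighs the likelihood, so the posterior favors class a:
print(posterior([0.2, 0.4], [0.75, 0.25]))  # [0.6, 0.4]
```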

  14. Bayesian Decision Theory. [Figure-only slide.]

  15. Bayesian Decision Theory. Why is it called this way? To some extent because it involves applying Bayes' rule, but this is not the whole story: the real reason is that it is built on so-called Bayesian probabilities.

  16. Bayesian Probabilities. Probability is not interpreted merely as the frequency of a certain event happening; rather, it is seen as a degree of belief in an outcome. Only this interpretation allows us to assert a prior belief in a data point coming from a certain class. Even though this might seem easy to accept now, this interpretation was quite contentious in statistics for a long time.

  17. Bayesian Decision Theory. Goal: minimize the misclassification rate (the probability of classifying wrongly). [Figure: joint densities p(x, C_1) and p(x, C_2) over x, with a decision boundary splitting the axis into regions R_1 and R_2.]

p(error) = p(x ∈ R_1, C_2) + p(x ∈ R_2, C_1)
         = ∫_{R_1} p(x, C_2) dx + ∫_{R_2} p(x, C_1) dx
         = ∫_{R_1} p(x | C_2) p(C_2) dx + ∫_{R_2} p(x | C_1) p(C_1) dx
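
A numerical sketch of p(error) under the Bayes-optimal regions, assuming (purely for illustration) two Gaussian class-conditionals with equal priors; the densities and the integration grid are my choices, not from the slides:

```python
import numpy as np
from scipy.stats import norm

p1 = p2 = 0.5                              # assumed equal priors
c1, c2 = norm(0.0, 1.0), norm(2.0, 1.0)    # assumed Gaussian class-conditionals

x, dx = np.linspace(-8.0, 10.0, 10001, retstep=True)
joint1, joint2 = c1.pdf(x) * p1, c2.pdf(x) * p2

in_R1 = joint1 > joint2  # Bayes-optimal region R1; R2 is the complement
# p(error) = integral over R1 of p(x, C2) + integral over R2 of p(x, C1)
p_error = np.sum(np.where(in_R1, joint2, joint1)) * dx
print(p_error)  # ~0.159 for these symmetric Gaussians (boundary at x = 1)
```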

  18. Bayesian Decision Theory. Decision rule: decide C_1 if p(C_1 | x) > p(C_2 | x). This is equivalent to:

p(x | C_1) p(C_1) / p(x) > p(x | C_2) p(C_2) / p(x)
p(x | C_1) p(C_1) > p(x | C_2) p(C_2)
p(x | C_1) / p(x | C_2) > p(C_2) / p(C_1)

A classifier obeying this rule is called a Bayes optimal classifier.
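
A minimal sketch of this rule in code. Working with the unnormalized products avoids computing p(x), which cancels on both sides; the numbers reuse the hypothetical values from above:

```python
def decide(likelihood1, likelihood2, prior1, prior2):
    """Bayes-optimal two-class decision: C1 iff p(x|C1) p(C1) > p(x|C2) p(C2)."""
    return "C1" if likelihood1 * prior1 > likelihood2 * prior2 else "C2"

# The decision flips with the prior, matching the earlier example:
print(decide(0.2, 0.4, 0.5, 0.5))    # C2: equal priors, class b more likely
print(decide(0.2, 0.4, 0.75, 0.25))  # C1: the prior on class a outweighs it
```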

  19. Bayesian Decision Theory. From the likelihood-ratio form p(x | C_1) / p(x | C_2) > p(C_2) / p(C_1), two special cases follow: if p(x | C_1) = p(x | C_2), decide using the priors, p(C_1) > p(C_2); if p(C_1) = p(C_2), decide using the likelihoods, p(x | C_1) > p(x | C_2).

  20. More than Two Classes. Generalization to more than two classes: decide for class k iff it has the highest a-posteriori probability,

p(C_k | x) > p(C_j | x)  for all j ≠ k.

This is equivalent to:

p(x | C_k) p(C_k) > p(x | C_j) p(C_j)  for all j ≠ k
p(x | C_k) / p(x | C_j) > p(C_j) / p(C_k)  for all j ≠ k
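
The multi-class rule is just an argmax over the unnormalized posteriors, since p(x) cancels; a minimal sketch (all numbers hypothetical):

```python
def bayes_decision(likelihoods, priors):
    """Pick the class k maximizing p(x | C_k) p(C_k)."""
    scores = [l * p for l, p in zip(likelihoods, priors)]
    return max(range(len(scores)), key=scores.__getitem__)

# Three classes with made-up likelihoods and priors:
print(bayes_decision([0.1, 0.3, 0.2], [0.5, 0.3, 0.2]))  # 1 (score 0.09 is largest)
```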

  21. More than Two Classes. Decision regions: R_1, R_2, R_3, ...

  22. High-Dimensional Features. So far we have only considered one-dimensional features, i.e., x ∈ R. We can use more features and generalize to an arbitrary D-dimensional feature space, i.e., x ∈ R^D. For instance, in the salmon vs. sea-bass classification task, x = (x_1, x_2)^T ∈ R^2, where x_1 is the width and x_2 is the lightness. The decision boundary we devised still applies to x ∈ R^D; we just need to use multivariate class-conditional densities p(x | C_k).
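
A sketch of the same rule with D = 2 features, assuming (purely for illustration) Gaussian class-conditional densities and priors for the salmon vs. sea-bass task; all means, covariances, and priors below are invented:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Assumed class-conditional densities p(x | C_k) over x = (width, lightness)
salmon = multivariate_normal(mean=[3.0, 6.0], cov=np.eye(2))
sea_bass = multivariate_normal(mean=[5.0, 3.0], cov=np.eye(2))
priors = {"salmon": 0.4, "sea bass": 0.6}  # assumed class priors

x = np.array([4.0, 5.0])  # feature vector of a new fish
scores = {
    "salmon": salmon.pdf(x) * priors["salmon"],
    "sea bass": sea_bass.pdf(x) * priors["sea bass"],
}
print(max(scores, key=scores.get))  # decide the class with the larger joint
```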

  23. Dummy Classes. In some applications it may be advantageous to have a dummy class denoted "don't know" or "don't care", also called a reject option. This is not a common case, though, and we will not cover it in this class.

  24. Section 2: Risk Minimization.

  25. Risk Minimization. So far we have tried to minimize the misclassification rate, but in many cases not every misclassification is equally bad. Smoke detector: if there is a fire, we need to be very sure that we classify it as such; if there is no fire, it is OK to occasionally have a false alarm. Medical diagnosis: if the patient is sick, we need to be very sure that we report them as sick; if they are healthy, it is OK to classify them as sick and order further testing that may help clarify this. A sketch of this idea follows below.
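
To preview how unequal costs change the rule: instead of maximizing the posterior, we pick the decision minimizing the expected loss Σ_k L(decision, C_k) p(C_k | x). A minimal sketch with a made-up smoke-detector loss matrix (the specific loss values and posterior are illustrative assumptions):

```python
# Loss L[decision][true class]: missing a fire is far worse than a false alarm
L = {"alarm":    {"fire": 0.0,   "no fire": 1.0},
     "no alarm": {"fire": 100.0, "no fire": 0.0}}

def min_risk_decision(posterior):
    """Choose the action minimizing expected loss under p(C_k | x)."""
    risk = {a: sum(L[a][c] * p for c, p in posterior.items()) for a in L}
    return min(risk, key=risk.get)

# Even with only 5% posterior probability of fire, we should sound the alarm:
# risk(alarm) = 0.95, risk(no alarm) = 5.0
print(min_risk_decision({"fire": 0.05, "no fire": 0.95}))  # 'alarm'
```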
