Chapter 3: Performance Measures in Classification


  1. Chapter 3: Performance Measures in Classification. Dr. Xudong Liu, Assistant Professor, School of Computing, University of North Florida. Monday, 9/16/2019.

  2. Confusion Matrix. Each row in a confusion matrix represents an actual class; each column represents a predicted class. Example (next slide): classify whether the digit in an image is a 5. True negatives (TN): predicted-negative examples that are actually negative. False positives (FP): predicted-positive examples that are actually negative. False negatives (FN): predicted-negative examples that are actually positive. True positives (TP): predicted-positive examples that are actually positive. In Python, you can compute it with scikit-learn's confusion_matrix function, as in the sketch below. The matrix extends naturally to multi-class classification problems.
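
  A minimal sketch of computing a confusion matrix with scikit-learn's confusion_matrix; the y_true and y_pred arrays are hypothetical toy labels, chosen so the counts match the 75% precision and 60% recall quoted on the later slides:

    # Hypothetical labels for the "is this digit a 5?" task (1 = yes, 0 = no).
    from sklearn.metrics import confusion_matrix

    y_true = [1, 1, 1, 1, 1, 0, 0, 0]   # actual classes (rows of the matrix)
    y_pred = [1, 1, 1, 0, 0, 1, 0, 0]   # predicted classes (columns)

    cm = confusion_matrix(y_true, y_pred)
    tn, fp, fn, tp = cm.ravel()          # TN=2, FP=1, FN=2, TP=3
    print(cm)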

  3. Confusion Matrix. [Figure: the confusion matrix for the digit-5 example.]

  4. Precision. Precision measures the accuracy of the positive predictions: precision = |TP| / (|TP| + |FP|). For the previous example, precision is 75%. But precision alone can be misleading: a classifier that makes only one positive prediction, and gets it right, has 100% precision yet is not useful.
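
  A short sketch, reusing the hypothetical labels from above, computing precision both from the definition and with scikit-learn's precision_score:

    from sklearn.metrics import precision_score

    y_true = [1, 1, 1, 1, 1, 0, 0, 0]   # hypothetical ground truth
    y_pred = [1, 1, 1, 0, 0, 1, 0, 0]   # hypothetical predictions

    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # |TP| = 3
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))  # |FP| = 1
    print(tp / (tp + fp))                    # 0.75, from the definition
    print(precision_score(y_true, y_pred))   # 0.75, via scikit-learn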

  5. Recall. Recall measures the percentage of positive examples predicted correctly: recall = |TP| / (|TP| + |FN|). For the previous example, recall is 60%.
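
  The same sketch adapted for recall, again with the hypothetical labels from above:

    from sklearn.metrics import recall_score

    y_true = [1, 1, 1, 1, 1, 0, 0, 0]
    y_pred = [1, 1, 1, 0, 0, 1, 0, 0]

    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))  # |TP| = 3
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))  # |FN| = 2
    print(tp / (tp + fn))                 # 0.6, from the definition
    print(recall_score(y_true, y_pred))   # 0.6, via scikit-learn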

  6. Precision/Recall Trade-off. Classification models, e.g., SGDClassifier, often predict based on a score computed for each example. If the score is below a set threshold, the example is predicted negative; otherwise, positive. For the previous setting, all examples are sorted by their scores; raising the threshold typically increases precision but lowers recall, as in the sketch below.
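
  A minimal sketch of the trade-off using SGDClassifier's decision_function; the synthetic dataset and the three thresholds are assumptions for illustration:

    from sklearn.datasets import make_classification
    from sklearn.linear_model import SGDClassifier
    from sklearn.metrics import precision_score, recall_score

    X, y = make_classification(n_samples=1000, random_state=42)
    clf = SGDClassifier(random_state=42).fit(X, y)
    scores = clf.decision_function(X)           # one score per example

    for threshold in (-2.0, 0.0, 2.0):          # default threshold is 0
        y_pred = (scores > threshold).astype(int)
        print(threshold,
              precision_score(y, y_pred),
              recall_score(y, y_pred))

  For a full picture, scikit-learn's precision_recall_curve computes precision and recall at every candidate threshold in one call, which is how trade-off plots like the one on the next slide are usually drawn.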

  7. Precision/Recall Trade-off. [Figure illustrating the precision/recall trade-off.]

  8. F1 Score. The F1 score is the harmonic mean of precision and recall: F1 = 2 / (1/precision + 1/recall). Unlike the regular (arithmetic) mean, the harmonic mean gives more weight to low values, so a classifier's F1 score is high only if both precision and recall are high.
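
  A short sketch checking the harmonic-mean formula against scikit-learn's f1_score, using the same hypothetical 75%-precision / 60%-recall labels as above:

    from sklearn.metrics import f1_score, precision_score, recall_score

    y_true = [1, 1, 1, 1, 1, 0, 0, 0]
    y_pred = [1, 1, 1, 0, 0, 1, 0, 0]

    p = precision_score(y_true, y_pred)   # 0.75
    r = recall_score(y_true, y_pred)      # 0.60
    print(2 / (1 / p + 1 / r))            # harmonic mean, about 0.667
    print(f1_score(y_true, y_pred))       # same value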
