Data Classification: Linear Classifier II (Linear Discriminant Analysis) - PowerPoint Presentation

  1. Data Classification: Linear Classifier II (Linear Discriminant Analysis)

  2. Mean Classification (recap): figure with the two decision regions ("If you’re here, you are RED; if you’re here, you are BLUE").

  3. Linear Classifier. A classifier that assigns a class to a new point based on a separating hyperplane is called a linear classifier. The criterion for a linear classifier can be written as a vector product, i.e., there is a vector w and a number c such that a new data vector x is classified as being in group one exactly if w·x > c.
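
  As a concrete illustration of that criterion (my own sketch, not from the slides; the weight vector, threshold, and test point below are made up), a minimal linear classifier in Python/NumPy:

      import numpy as np

      # Hypothetical weight vector w and threshold c defining the hyperplane w.x = c.
      w = np.array([1.0, -0.5, 0.25])
      c = 0.5

      def linear_classify(x, w, c):
          # Group one exactly if the linear criterion w.x > c holds.
          return "group one" if w @ x > c else "group two"

      print(linear_classify(np.array([2.0, 1.0, 1.0]), w, c))   # 2.0 - 0.5 + 0.25 = 1.75 > 0.5 -> "group one"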

  4. Limitations of the Mean Classifier: leave-one-out (LOO) accuracy 66.6 %.

  5. Linear Classifier Works: LOO performance 100 %.
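
  LOO on these two slides is leave-one-out cross-validation: each item is held out in turn, the classifier is refit on the rest, and the held-out item is checked. A minimal sketch for the mean (nearest-centroid) classifier in Python/NumPy, on made-up 2-D data since the slides' own data is not reproduced here:

      import numpy as np

      def nearest_mean_loo(X, y):
          # Leave-one-out accuracy of the mean (nearest-centroid) classifier.
          hits = 0
          for i in range(len(X)):
              Xtr, ytr = np.delete(X, i, axis=0), np.delete(y, i)   # hold out item i
              means = {c: Xtr[ytr == c].mean(axis=0) for c in np.unique(ytr)}
              pred = min(means, key=lambda c: np.linalg.norm(X[i] - means[c]))
              hits += int(pred == y[i])
          return hits / len(X)

      # Made-up data: two well-separated groups of three points each.
      X = np.array([[0, 0], [1, 0], [0, 1], [3, 3], [4, 3], [3, 4]], dtype=float)
      y = np.array([0, 0, 0, 1, 1, 1])
      print(nearest_mean_loo(X, y))   # 1.0 here; the slides report 66.6 % for their data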

  6. Linear Discriminant Analysis. Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix.

  10. Linear Discriminant Analysis. Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix. Q1: Why doesn’t the mean classifier work here? (1) The points are not linearly separable. (2) The covariance matrix is far from the identity matrix.

  11. Linear Discriminant Analysis. Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix. Observation 2: Mean Classification works well if the variables really are distributed with a unit covariance matrix, but badly otherwise.

  12. Linear Discriminant Analysis. Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix. Observation 2: Mean Classification works well if the variables really are distributed with a unit covariance matrix, but badly otherwise. Linear Discriminant Analysis (LDA): implement Observation 1, but using the real covariance matrix of the data!
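
  A minimal sketch of Observation 1 in Python/NumPy (my own illustration; the group means and the test point are made up): with the identity as covariance matrix, the Gaussian log-likelihood is a constant minus half the squared distance to the mean, so the likelihood rule and the nearest-mean rule always agree.

      import numpy as np

      mu_red, mu_blue = np.array([0.0, 0.0]), np.array([3.0, 1.0])
      x = np.array([2.0, 0.0])

      # Mean classification: assign x to the nearer group mean.
      by_distance = "RED" if np.linalg.norm(x - mu_red) < np.linalg.norm(x - mu_blue) else "BLUE"

      # Gaussian log-likelihood with identity covariance: log N(x; mu, I) = const - 0.5*||x - mu||^2.
      def loglik(x, mu):
          return -0.5 * np.sum((x - mu) ** 2)   # the constant term is the same for both groups

      by_likelihood = "RED" if loglik(x, mu_red) > loglik(x, mu_blue) else "BLUE"
      print(by_distance, by_likelihood)          # both say BLUE: the two rules agree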

  13. Linear Discriminant Analysis. Linear Discriminant Analysis (LDA): classify according to a Gaussian likelihood with the covariance matrix of the data. Q2: Which color does LDA assign to this point? (1) RED (2) BLUE

  14. Linear Discriminant Analysis. Linear Discriminant Analysis (LDA): classify according to a Gaussian likelihood with the covariance matrix of the data.

  15. Linear Discriminant Analysis. Linear Discriminant Analysis: classify according to the Gaussian likelihoods with the common covariance matrix Σ. That is: classify x as blue if N(x; µ_blue, Σ) > N(x; µ_red, Σ).

  16. Linear Discriminant Analysis: step-by-step derivation of the decision rule (fill-in items Q3, Q4, Q5).
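
  The derivation behind these slides, reconstructed here as a sketch of the standard argument (the slide's own formulas are not preserved in this transcript): comparing the two Gaussian log-likelihoods with the common covariance matrix Σ, the quadratic term cancels and a linear rule remains.

      % Classify x as blue iff  N(x; \mu_b, \Sigma) > N(x; \mu_r, \Sigma).
      % Taking logs and dropping the shared normalisation constant:
      %   -(x - \mu_b)^T \Sigma^{-1} (x - \mu_b)  >  -(x - \mu_r)^T \Sigma^{-1} (x - \mu_r)
      % The quadratic term x^T \Sigma^{-1} x cancels, leaving the linear rule
      \[
        w^T x > c, \qquad
        w = \Sigma^{-1} (\mu_b - \mu_r), \qquad
        c = \tfrac{1}{2} \left( \mu_b^T \Sigma^{-1} \mu_b - \mu_r^T \Sigma^{-1} \mu_r \right).
      \]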

  17. Linear Discriminant Analysis

  18. Linear Discriminant Analysis. Let µ1 and µ2 be the two group means in the training set, and Σ the covariance matrix. The linear classifier that classifies each item x to the group with the higher Gaussian likelihood under these means and the common covariance matrix is called Linear Discriminant Analysis. Note: the common covariance matrix is the average squared distance from the mean in each group, not from the total mean!
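
  A minimal sketch of this definition in Python/NumPy (my own illustration, not code from the slides): the common covariance matrix is pooled from the deviations of each item from its own group mean, as the note stresses, and the resulting decision rule is the linear one derived above.

      import numpy as np

      def lda_fit(X1, X2):
          # Fit LDA from two groups of row vectors; returns (w, c) of the rule w.x > c -> group 1.
          mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
          D = np.vstack([X1 - mu1, X2 - mu2])        # deviations from each group's OWN mean
          Sigma = D.T @ D / len(D)                   # common (pooled within-group) covariance
          w = np.linalg.solve(Sigma, mu1 - mu2)      # w = Sigma^{-1} (mu1 - mu2)
          c = 0.5 * (mu1 + mu2) @ w                  # threshold halfway between the projected group means
          return w, c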

  19. Example for LDA. Treatment Group: ID5 = (3,1,1), ID6 = (4,1,1), ID7 = (0,2,0), ID8 = (1,0,2). Control Group: ID1 = (1,2,1), ID2 = (2,1,0), ID3 = (1,1,1), ID4 = (0,0,2).

  20. Example for LDA: the same Treatment and Control data as above (Q6: answer 1 or 2).

  25. Example for LDA: Control Group and Treatment Group (figure).

  26. Example for LDA: the same Treatment and Control data as above, with the filter applied.

  27. Example for LDA: filter values. Treatment Group: ID5 = (3,1,1): 2.1, ID6 = (4,1,1): 5.4, ID7 = (0,2,0): -0.7, ID8 = (1,0,2): -2.1. Control Group: ID1 = (1,2,1): -0.1, ID2 = (2,1,0): 1.5, ID3 = (1,1,1): -0.7, ID4 = (0,0,2): -3.5.
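
  A sketch of how such filter values can be computed in Python/NumPy, following the LDA recipe from slide 18 (my own illustration; the exact filter and offset used on the slide are not preserved here, so the printed scores need not match the table above):

      import numpy as np

      # Example data from the slide (Treatment: IDs 5-8, Control: IDs 1-4).
      T = np.array([[3, 1, 1], [4, 1, 1], [0, 2, 0], [1, 0, 2]], dtype=float)
      C = np.array([[1, 2, 1], [2, 1, 0], [1, 1, 1], [0, 0, 2]], dtype=float)

      muT, muC = T.mean(axis=0), C.mean(axis=0)
      D = np.vstack([T - muT, C - muC])           # deviations from each group's own mean
      Sigma = D.T @ D / len(D)                    # common covariance matrix
      w = np.linalg.solve(Sigma, muT - muC)       # LDA filter direction
      c = 0.5 * (muT + muC) @ w                   # threshold between the projected means

      for label, X in [("Treatment IDs 5-8:", T), ("Control   IDs 1-4:", C)]:
          print(label, np.round(X @ w - c, 2))    # positive score = treatment side of the plane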

  28. Example for LDA: ID1 = (1,2,1), ID2 = (2,1,0), ID3 = (1,1,1), ID4 = (0,0,2); ID5 = (3,1,1), ID6 = (4,1,1), ID7 = (0,2,0), ID8 = (1,0,2) (figure).

  30. Example for LDA: ID1 = (1,2,1), ID2 = (2,1,0), ID3 = (1,1,1), ID4 = (0,0,2); ID5 = (3,1,1), ID6 = (4,1,1), ID7 = (0,2,2), ID8 = (1,0,0) (figure).

  31. Example for LDA: the same data as above, with the separation plane drawn and misclassified items marked (figure).

  32. Geometry of Linear Classifier

  33. Linear Regression. Training pairs (x, t): ID1 = (1,2,1), t = -1; ID2 = (2,1,0), t = -1; ID3 = (1,1,1), t = -1; ID4 = (0,0,2), t = -1; ID5 = (3,1,1), t = 1; ID6 = (4,1,1), t = 1; ID7 = (0,2,2), t = 1; ID8 = (1,0,0), t = 1. Make the outputs as close as possible to the targets, i.e. minimize the squared error Σ_i (w·x_i - t_i)².

  34. Linear Regression. Each x is augmented with a leading 1 so the offset becomes part of w: ID1 = (1,1,2,1), t = -1; ID2 = (1,2,1,0), t = -1; ID3 = (1,1,1,1), t = -1; ID4 = (1,0,0,2), t = -1; ID5 = (1,3,1,1), t = 1; ID6 = (1,4,1,1), t = 1; ID7 = (1,0,2,2), t = 1; ID8 = (1,1,0,0), t = 1. Stacking the augmented vectors as the rows of X and the targets into t, the task is: minimize ‖Xw - t‖².

  35. Linear Regression. Minimize ‖Xw - t‖². Minimization is the same as setting the first derivative to zero: X^T (Xw - t) = 0, i.e. X^T X w = X^T t, so w = (X^T X)^(-1) X^T t.
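
  A sketch of that computation in Python/NumPy on the example data from slide 34 (my own illustration; the slides' numerical result is not reproduced in this transcript):

      import numpy as np

      # Augmented data matrix X (leading 1 for the offset) and targets t, as on slide 34.
      X = np.array([[1, 1, 2, 1], [1, 2, 1, 0], [1, 1, 1, 1], [1, 0, 0, 2],
                    [1, 3, 1, 1], [1, 4, 1, 1], [1, 0, 2, 2], [1, 1, 0, 0]], dtype=float)
      t = np.array([-1, -1, -1, -1, 1, 1, 1, 1], dtype=float)

      # Normal equations: X^T X w = X^T t, i.e. w = (X^T X)^(-1) X^T t.
      w = np.linalg.solve(X.T @ X, X.T @ t)
      print(w)                # regression weights; the first entry is the offset
      print(np.sign(X @ w))   # predicted class signs for the training items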

  36. Answer Form for Working with Linear Classifier II: Q1 (1 or 2), Q2 (1 or 2), Q3, Q4, Q5, Q6 (1 or 2).
