Data Classification: Linear Classifier II, Linear Discriminant Analysis - PowerPoint PPT Presentation



SLIDE 1

Data Classification

Linear Classifier II

Linear Discriminant Analysis

SLIDE 2

Mean Classification

If you’re here, you are BLUE. If you’re here, you are RED.

SLIDE 3

Linear Classifier

A classifier that assigns a class to a new point based on a separating hyperplane is called a linear classifier. The criterion for a linear classifier can be written as an inner product, i.e., there is a vector w and a number c such that a new data vector x is classified as being in group one exactly if w·x > c.
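The rule above can be sketched in a few lines; the vector `w`, the threshold `c`, and the test point below are illustrative assumptions, not values from the slides.

```python
import numpy as np

def linear_classify(x, w, c):
    """Assign group 1 if w . x > c, else group 2 (the slide's criterion)."""
    return 1 if np.dot(w, x) > c else 2

# Illustrative values (assumed): a plane separating two 2-D groups.
w = np.array([1.0, -1.0])
c = 0.5
print(linear_classify(np.array([2.0, 0.0]), w, c))  # w.x = 2.0 > 0.5 -> 1
print(linear_classify(np.array([0.0, 2.0]), w, c))  # w.x = -2.0 <= 0.5 -> 2
```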

SLIDE 4

Limitations of Mean Classifier

LOO (leave-one-out) accuracy: 66.6 %
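Leave-one-out accuracy of the mean classifier can be estimated as sketched below; the two-dimensional toy clusters are an assumption for illustration, not the slide's data.

```python
import numpy as np

def mean_classify(x, mu1, mu2):
    # Mean classifier: pick the group whose mean is closer to x.
    return 1 if np.linalg.norm(x - mu1) < np.linalg.norm(x - mu2) else 2

def loo_accuracy(points, labels):
    """Leave-one-out: hold out each point, fit the means on the rest, test on it."""
    hits = 0
    for i in range(len(points)):
        rest = [(p, l) for j, (p, l) in enumerate(zip(points, labels)) if j != i]
        mu1 = np.mean([p for p, l in rest if l == 1], axis=0)
        mu2 = np.mean([p for p, l in rest if l == 2], axis=0)
        hits += mean_classify(points[i], mu1, mu2) == labels[i]
    return hits / len(points)

# Illustrative toy data (assumed): two well-separated clusters.
pts = [np.array(p, dtype=float) for p in [(0, 0), (1, 0), (0, 1), (5, 5), (6, 5), (5, 6)]]
labs = [1, 1, 1, 2, 2, 2]
print(loo_accuracy(pts, labs))  # well-separated clusters -> 1.0
```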

SLIDE 5

Linear Classifier Works

LOO performance: 100 %

SLIDE 6

Linear Discriminant Analysis

Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix.
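Observation 1 can be made precise with a short derivation (a sketch; μ1 and μ2 denote the two group means):

```latex
% Gaussian likelihood with identity covariance:
% N(x;\mu_k, I) \propto \exp(-\|x-\mu_k\|^2/2),
% so the likelihood is larger for the group whose mean is closer.
\|x-\mu_1\|^2 < \|x-\mu_2\|^2
\iff -2\mu_1^\top x + \|\mu_1\|^2 < -2\mu_2^\top x + \|\mu_2\|^2
\iff (\mu_1-\mu_2)^\top x > \tfrac{1}{2}\left(\|\mu_1\|^2-\|\mu_2\|^2\right)
```

so the mean classifier is exactly the linear classifier with w = μ1 − μ2 and c = (‖μ1‖² − ‖μ2‖²)/2.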

SLIDE 10

Linear Discriminant Analysis

Q1: Why doesn’t the mean classifier work here?
1. The points are not linearly separable.
2. The covariance matrix is far from the identity matrix.

Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix.

SLIDE 11

Linear Discriminant Analysis

Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix. Observation 2: Mean Classification works great if the variables really are distributed with a unit covariance matrix, but badly otherwise.

SLIDE 12

Linear Discriminant Analysis

Observation 1: Mean Classification is equivalent to classifying according to a Gaussian likelihood with the identity as covariance matrix. Observation 2: Mean Classification works great if the variables really are distributed with a unit covariance matrix, but badly otherwise. Linear Discriminant Analysis (LDA): implement Observation 1, but using the real covariance matrix of the data!

SLIDE 13

Linear Discriminant Analysis

Linear Discriminant Analysis (LDA): Classify according to a Gaussian likelihood with the covariance matrix of the data.

Q2: Which color does LDA classify this point to?
1. RED
2. BLUE

SLIDE 15

Linear Discriminant Analysis

Linear Discriminant Analysis: classify according to a Gaussian with common covariance matrix Σ. That is: classify x as blue if

N(x; μ_blue, Σ) > N(x; μ_red, Σ),

i.e. if the Gaussian likelihood under the blue mean exceeds the likelihood under the red mean.
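Taking logarithms of the two Gaussian likelihoods, the quadratic term xᵀΣ⁻¹x appears on both sides and cancels, so the comparison is again linear in x (a sketch, with μ1 the blue and μ2 the red mean):

```latex
(x-\mu_1)^\top \Sigma^{-1}(x-\mu_1) < (x-\mu_2)^\top \Sigma^{-1}(x-\mu_2)
\iff (\mu_1-\mu_2)^\top \Sigma^{-1} x
     > \tfrac{1}{2}\left(\mu_1^\top \Sigma^{-1}\mu_1 - \mu_2^\top \Sigma^{-1}\mu_2\right)
```

i.e. a linear classifier with w = Σ⁻¹(μ1 − μ2).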

SLIDE 16

Linear Discriminant Analysis

Q3 Q4 Q5

SLIDE 17

Linear Discriminant Analysis

SLIDE 18

Linear Discriminant Analysis

Let µ1 and µ2 be the two group means in the training set, and Σ the covariance matrix. The linear classifier that classifies each item x to the group with higher Gaussian likelihood under these means and the common covariance matrix is called Linear Discriminant Analysis. Note: the common covariance matrix is the average squared distance from the mean in each group, not from the total mean!
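The definition above can be sketched in code; this is a minimal illustration assuming NumPy (with group 1 = the first group passed in), not the deck's own implementation.

```python
import numpy as np

def fit_lda(X1, X2):
    """Fit LDA from two groups: the group means and the pooled covariance."""
    mu1, mu2 = X1.mean(axis=0), X2.mean(axis=0)
    # Pooled covariance: average squared deviation from each group's OWN mean,
    # as the slide's note emphasizes (not deviation from the total mean).
    D = np.vstack([X1 - mu1, X2 - mu2])
    sigma = D.T @ D / len(D)
    return mu1, mu2, sigma

def lda_classify(x, mu1, mu2, sigma):
    """Assign x to the group with higher Gaussian likelihood (shared covariance)."""
    inv = np.linalg.pinv(sigma)  # pinv guards against a singular estimate
    d1 = (x - mu1) @ inv @ (x - mu1)
    d2 = (x - mu2) @ inv @ (x - mu2)
    return 1 if d1 < d2 else 2
```

Equivalently, as in the derivation above, the resulting rule is linear with w = Σ⁻¹(μ1 − μ2).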

SLIDE 19

Example for LDA

Control Group: ID 1 = (1,2,1), ID 2 = (2,1,0), ID 3 = (1,1,1), ID 4 = (0,0,2)
Treatment Group: ID 5 = (3,1,1), ID 6 = (4,1,1), ID 7 = (0,2,0), ID 8 = (1,0,2)
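As a quick check, the group means for this training set can be computed directly (a sketch assuming NumPy; control = IDs 1–4, treatment = IDs 5–8 as listed on the slide):

```python
import numpy as np

# Slide data: IDs 1-4 form the control group, IDs 5-8 the treatment group.
control = np.array([[1, 2, 1], [2, 1, 0], [1, 1, 1], [0, 0, 2]], dtype=float)
treatment = np.array([[3, 1, 1], [4, 1, 1], [0, 2, 0], [1, 0, 2]], dtype=float)

print(control.mean(axis=0))    # -> [1. 1. 1.]
print(treatment.mean(axis=0))  # -> [2. 1. 1.]
```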

SLIDE 20

Example for LDA

Control Group: ID 1 = (1,2,1), ID 2 = (2,1,0), ID 3 = (1,1,1), ID 4 = (0,0,2)
Treatment Group: ID 5 = (3,1,1), ID 6 = (4,1,1), ID 7 = (0,2,0), ID 8 = (1,0,2)

Q6

1 2

SLIDE 26

Example for LDA

Control Group: ID 1 = (1,2,1), ID 2 = (2,1,0), ID 3 = (1,1,1), ID 4 = (0,0,2)
Treatment Group: ID 5 = (3,1,1), ID 6 = (4,1,1), ID 7 = (0,2,0), ID 8 = (1,0,2)
Filter

SLIDE 27

Example for LDA

Control Group: ID 1 = (1,2,1), ID 2 = (2,1,0), ID 3 = (1,1,1), ID 4 = (0,0,2)
Treatment Group: ID 5 = (3,1,1), ID 6 = (4,1,1), ID 7 = (0,2,0), ID 8 = (1,0,2)
SLIDE 28

Example for LDA

Control Group: ID 1 = (1,2,1), ID 2 = (2,1,0), ID 3 = (1,1,1), ID 4 = (0,0,2)
Treatment Group: ID 5 = (3,1,1), ID 6 = (4,1,1), ID 7 = (0,2,0), ID 8 = (1,0,2)
Control group mean: (1, 1, 1)

SLIDE 30

Example for LDA

Control Group: ID 1 = (1,2,1), ID 2 = (2,1,0), ID 3 = (1,1,1), ID 4 = (0,0,2)
Treatment Group: ID 5 = (3,1,1), ID 6 = (4,1,1), ID 7 = (0,2,2), ID 8 = (1,0,0)
Control group mean: (1, 1, 1)

SLIDE 31

Example for LDA

Control Group: ID 1 = (1,2,1), ID 2 = (2,1,0), ID 3 = (1,1,1), ID 4 = (0,0,2)
Treatment Group: ID 5 = (3,1,1), ID 6 = (4,1,1), ID 7 = (0,2,2), ID 8 = (1,0,0)
(figure labels: separation plane, misclassified)

SLIDE 32

Geometry of Linear Classifier

SLIDE 33

Linear Regression

ID 1 = (1,2,1), t = -1; ID 2 = (2,1,0), t = -1; ID 3 = (1,1,1), t = -1; ID 4 = (0,0,2), t = -1
ID 5 = (3,1,1), t = 1; ID 6 = (4,1,1), t = 1; ID 7 = (0,2,2), t = 1; ID 8 = (1,0,0), t = 1

Make w·x as close as possible to the target t, i.e. minimize Σᵢ (w·xᵢ − tᵢ)².

SLIDE 34

Linear Regression

ID 1 = (1,1,2,1), t = -1; ID 2 = (1,2,1,0), t = -1; ID 3 = (1,1,1,1), t = -1; ID 4 = (1,0,0,2), t = -1
ID 5 = (1,3,1,1), t = 1; ID 6 = (1,4,1,1), t = 1; ID 7 = (1,0,2,2), t = 1; ID 8 = (1,1,0,0), t = 1

Each x now has a leading 1 for the intercept; minimize Σᵢ (w·xᵢ − tᵢ)² = ‖Xw − t‖².

SLIDE 35

Linear Regression

Minimize ‖Xw − t‖². Minimization is the same as setting the first derivative to zero, which gives the normal equations XᵀX w = Xᵀt, so w = (XᵀX)⁻¹Xᵀt.
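A minimal least-squares sketch with the slide's data, assuming NumPy (`lstsq` solves the same minimization as the normal equations):

```python
import numpy as np

# Slide data: each row starts with a constant 1 for the intercept.
X = np.array([
    [1, 1, 2, 1], [1, 2, 1, 0], [1, 1, 1, 1], [1, 0, 0, 2],   # targets -1
    [1, 3, 1, 1], [1, 4, 1, 1], [1, 0, 2, 2], [1, 1, 0, 0],   # targets +1
], dtype=float)
t = np.array([-1, -1, -1, -1, 1, 1, 1, 1], dtype=float)

# Solve the least-squares problem min ||Xw - t||^2 (the stable route to
# the normal equations X^T X w = X^T t).
w, *_ = np.linalg.lstsq(X, t, rcond=None)

# At the minimum the gradient X^T (Xw - t) vanishes.
print(X.T @ (X @ w - t))
```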

SLIDE 36

Answer Form: Working with Linear Classifier II

Q1: 1 / 2, Q2: 1 / 2, Q3: 1 / 2, Q4: 1 / 2, Q5: 1 / 2, Q6: 1 / 2