SLIDE 1
Classification: K-nearest neighbor classification, Distance functions
Topics covered:
- K-nearest neighbor classification
- Distance functions
- Choice of k
- Leave-one-out cross validation
- K-fold cross validation (classification error = average classification error on the K folds)
- Linear classification
SLIDE 2
SLIDE 3
K-nearest neighbor classification
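The slide bodies are not preserved in this extraction; as a minimal sketch of k-nearest-neighbor classification in plain Python (the toy data set and k = 3 are illustrative, not from the deck):

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify `query` by majority vote among its k nearest
    training points under Euclidean distance. `train` is a list
    of (feature_vector, label) pairs."""
    neighbors = sorted(train, key=lambda pair: math.dist(pair[0], query))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

# Toy 2-D training set: two well-separated clusters.
train = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"),
         ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b")]
print(knn_classify(train, (0.5, 0.5)))  # → a (all 3 nearest neighbors are "a")
```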
SLIDE 4
Distance functions
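Which distance functions this slide lists is not recoverable here; three common choices for nearest-neighbor methods, sketched in plain Python, are:

```python
import math

def euclidean(x, y):
    """L2 distance: straight-line distance between two vectors."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x, y)))

def manhattan(x, y):
    """L1 distance: sum of absolute coordinate differences."""
    return sum(abs(a - b) for a, b in zip(x, y))

def cosine_distance(x, y):
    """1 - cosine similarity: depends on direction, not magnitude."""
    dot = sum(a * b for a, b in zip(x, y))
    nx = math.sqrt(sum(a * a for a in x))
    ny = math.sqrt(sum(b * b for b in y))
    return 1 - dot / (nx * ny)

print(euclidean((0, 0), (3, 4)))   # → 5.0
print(manhattan((0, 0), (3, 4)))   # → 7
```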
SLIDE 5
Choice of k
SLIDE 6
Choice of k
SLIDE 7
SLIDE 8
Leave-one-out cross validation
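The procedure can be sketched directly: hold each point out in turn, train on the remaining n-1 points, and average the errors. The 1-nearest-neighbor classifier below is only there to exercise the loop; any `classify(train, query)` function would do.

```python
import math

def nn_classify(train, query):
    """1-nearest-neighbor: label of the closest training point."""
    return min(train, key=lambda pair: math.dist(pair[0], query))[1]

def leave_one_out_error(data, classify):
    """Hold each point out in turn, train on the rest, and count
    misclassifications. Returns the leave-one-out error rate."""
    errors = 0
    for i, (x, y) in enumerate(data):
        rest = data[:i] + data[i + 1:]
        if classify(rest, x) != y:
            errors += 1
    return errors / len(data)

data = [((0, 0), "a"), ((0, 1), "a"), ((5, 5), "b"), ((5, 6), "b")]
print(leave_one_out_error(data, nn_classify))  # → 0.0
```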
SLIDE 9
K-fold cross validation
Classification Error = Average classification error on K folds
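The averaging rule above can be sketched as follows; the striding split `data[i::k]` is one simple (unshuffled) way to form folds, and the 1-NN classifier is just a stand-in:

```python
import math

def nn_classify(train, query):
    """1-nearest-neighbor, used here only to exercise the loop."""
    return min(train, key=lambda pair: math.dist(pair[0], query))[1]

def k_fold_error(data, classify, k=5):
    """Split the data into k folds; each fold serves once as the
    held-out test set while the other k-1 folds form the training
    set. Returns the average per-fold classification error."""
    folds = [data[i::k] for i in range(k)]   # striding split (no shuffle)
    fold_errors = []
    for i, test in enumerate(folds):
        train = [p for j, fold in enumerate(folds) if j != i for p in fold]
        wrong = sum(classify(train, x) != y for x, y in test)
        fold_errors.append(wrong / len(test))
    return sum(fold_errors) / k

data = [((0, 0), "a"), ((0, 1), "a"), ((1, 0), "a"), ((1, 1), "a"),
        ((5, 5), "b"), ((5, 6), "b"), ((6, 5), "b"), ((6, 6), "b")]
print(k_fold_error(data, nn_classify, k=2))  # → 0.0
```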
SLIDE 10
Linear Classification
SLIDE 11
Linear separability
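A training set is linearly separable when some hyperplane w·x + b = 0 puts every +1 point on one side and every -1 point on the other. One practical way to probe this is the perceptron rule, which finds a separator whenever one exists; this sketch (and its epoch budget) is illustrative, not from the deck:

```python
def perceptron(data, epochs=100):
    """Try to find (w, b) with sign(w·x + b) matching the +1/-1
    labels. On linearly separable data the perceptron update
    converges; otherwise it keeps making mistakes."""
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        mistakes = 0
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score <= 0:            # misclassified (or on boundary)
                w = [wi + y * xi for wi, xi in zip(w, x)]
                b += y
                mistakes += 1
        if mistakes == 0:                 # separating hyperplane found
            return w, b
    return None                           # no separator within the budget

and_data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
xor_data = [((0, 0), -1), ((1, 1), -1), ((0, 1), 1), ((1, 0), 1)]
print(perceptron(and_data) is not None)  # → True (separable)
print(perceptron(xor_data) is not None)  # → False (XOR is not separable)
```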
SLIDE 12
Inseparability
- Real-world problems: there may not exist a hyperplane that separates the classes cleanly
- Solution to this “inseparability” problem: map the data to a higher-dimensional space
- Called the “feature space”, as opposed to the original “input space”
- An inseparable training set can be made separable with a proper choice of feature space
SLIDE 13
Feature map
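As a concrete illustration (the particular map is an assumption, not from the deck): lifting 2-D inputs with a quadratic cross term makes XOR, which is not linearly separable in input space, separable in feature space.

```python
def feature_map(x):
    """Illustrative quadratic feature map: lift 2-D input
    (x1, x2) into the feature space (x1, x2, x1*x2)."""
    x1, x2 = x
    return (x1, x2, x1 * x2)

# XOR in input space has no separating line, but in this feature
# space the hyperplane w·z + b = 0 with w = (1, 1, -2), b = -0.5
# separates it.
w, b = (1.0, 1.0, -2.0), -0.5
xor_data = [((0, 0), -1), ((1, 1), -1), ((0, 1), 1), ((1, 0), 1)]
for x, y in xor_data:
    z = feature_map(x)
    score = sum(wi * zi for wi, zi in zip(w, z)) + b
    assert (1 if score > 0 else -1) == y   # every point correctly classified
```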
SLIDE 14
Linear classifier
SLIDE 15
Linear classifier
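The decision rule these slides describe can be sketched in a few lines: predict from the sign of w·x + b, with the hyperplane w·x + b = 0 as the decision boundary (the example weights are illustrative).

```python
def linear_classify(w, b, x):
    """Linear decision rule: predict +1 or -1 from the sign of
    w·x + b; the hyperplane w·x + b = 0 is the boundary."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score > 0 else -1

# Example: the line x1 + x2 - 1 = 0 in 2-D.
w, b = (1.0, 1.0), -1.0
print(linear_classify(w, b, (2, 2)))  # → 1  (above the line)
print(linear_classify(w, b, (0, 0)))  # → -1 (below the line)
```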
SLIDE 16
Good and bad linear classifiers
SLIDE 17
Support Vector Machine
Two popular implementations
SLIDE 18
Margin
SLIDE 19
Margin
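The formulas on these slides are not preserved; the standard definitions, consistent with a linear classifier w·x + b, are:

```latex
\gamma_i = \frac{y_i\,(w \cdot x_i + b)}{\lVert w \rVert},
\qquad
\gamma = \min_i \gamma_i,
\qquad
\text{margin width (canonical form)} = \frac{2}{\lVert w \rVert}
```

where the canonical form rescales w and b so that min_i y_i(w·x_i + b) = 1; maximizing the margin is then equivalent to minimizing ||w||.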
SLIDE 20
Linear Support Vector Machine
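A minimal training sketch, assuming the usual regularized hinge-loss objective (the step size, epoch count, and toy data are illustrative; this is not a production solver):

```python
def linear_svm(data, lam=0.01, lr=0.05, epochs=1000):
    """Full-batch subgradient descent on the regularized hinge loss
        lam/2 * ||w||^2 + (1/n) * sum_i max(0, 1 - y_i (w·x_i + b)).
    Returns the learned weights and bias (w, b)."""
    n = len(data)
    dim = len(data[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        gw = [lam * wi for wi in w]        # gradient of the regularizer
        gb = 0.0
        for x, y in data:
            score = sum(wi * xi for wi, xi in zip(w, x)) + b
            if y * score < 1:              # hinge active: margin violated
                gw = [gi - y * xi / n for gi, xi in zip(gw, x)]
                gb -= y / n
        w = [wi - lr * gi for wi, gi in zip(w, gw)]
        b -= lr * gb
    return w, b

# Toy separable set: two clusters labeled -1 / +1.
data = [((0, 0), -1), ((0, 1), -1), ((4, 4), 1), ((4, 5), 1)]
w, b = linear_svm(data)
# The learned (w, b) should classify both training clusters correctly.
```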
SLIDE 21
Inseparable case
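The slide body is not preserved; the standard treatment of the inseparable case is the soft-margin formulation, which adds slack variables ξ_i so individual points may violate the margin at a cost controlled by C:

```latex
\min_{w,\,b,\,\xi}\;\; \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{n} \xi_i
\quad\text{subject to}\quad
y_i\,(w \cdot x_i + b) \ge 1 - \xi_i,
\qquad \xi_i \ge 0
```

Larger C penalizes margin violations more heavily; as C → ∞ this recovers the hard-margin SVM for separable data.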
SLIDE 22