Machine Learning
Lecture 5: Support Vector Machines
Justin Pearson, 2020
http://user.it.uu.se/~justin/Teaching/MachineLearning/index.html
Separating Hyperplanes

Logistic regression (with linear features) finds a hyperplane that separates the two classes.
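As a minimal sketch of the separating-hyperplane idea (the weight vector, bias, and sample points below are made up for illustration, not taken from the lecture): a hyperplane w·x + b = 0 assigns a point to a class by the sign of w·x + b, and the point's distance to the hyperplane is |w·x + b| / ‖w‖.

```python
import numpy as np

# Illustrative hyperplane w.x + b = 0 in the plane (assumed values).
w = np.array([1.0, -1.0])
b = 0.0

points = np.array([
    [2.0, 1.0],   # below the line y = x
    [1.0, 3.0],   # above the line y = x
])

# Which side of the hyperplane a point lies on: sign(w.x + b).
labels = np.sign(points @ w + b)

# Unsigned distance from each point to the hyperplane: |w.x + b| / ||w||.
margins = np.abs(points @ w + b) / np.linalg.norm(w)

print(labels)   # -> [ 1. -1.]
print(margins)
```

A support vector machine chooses w and b so that the smallest of these distances (the margin) is as large as possible.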
[Figures: labelled data points plotted in the (x, y) plane.]
[Figures: the error and the sigmoid function plotted against the weighted input, and a piecewise-linear approximation of the error.]
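The plotted approximation can be sketched numerically. This is an assumed reading of the figure, not code from the lecture: the logistic loss log(1 + e^(−z)) as a function of the weighted input z, alongside the hinge loss max(0, 1 − z), which acts as a piecewise-linear approximation of it.

```python
import numpy as np

# Weighted inputs z = y * (w.x + b) at a few sample values.
z = np.linspace(-10, 10, 5)   # [-10, -5, 0, 5, 10]

# Logistic (cross-entropy) loss as a function of the weighted input.
logistic_loss = np.log(1 + np.exp(-z))

# Hinge loss: a piecewise-linear approximation used by SVMs.
hinge_loss = np.maximum(0.0, 1.0 - z)

# Both losses vanish for confidently correct points (large positive z)
# and grow roughly linearly for confidently wrong points (large negative z).
print(np.round(logistic_loss, 3))
print(np.round(hinge_loss, 3))
```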
Picture taken from Wikipedia.
Actually, H stands for Hilbert, not higher, but do not worry about this.
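The point of working in this feature (Hilbert) space H can be sketched with a concrete kernel. The polynomial kernel and sample points below are illustrative assumptions, not from the slides: a kernel k(x, x′) equals the inner product ⟨φ(x), φ(x′)⟩ in H, so we never need to build φ(x) explicitly.

```python
import numpy as np

# Degree-2 polynomial kernel on R^2 (illustrative choice).
def poly_kernel(x, y):
    return (x @ y + 1.0) ** 2

# The explicit feature map phi: R^2 -> R^6 that this kernel corresponds to:
# phi(x) = (1, sqrt(2)x1, sqrt(2)x2, x1^2, sqrt(2)x1x2, x2^2)
def phi(v):
    x1, x2 = v
    s = np.sqrt(2.0)
    return np.array([1.0, s * x1, s * x2, x1 * x1, s * x1 * x2, x2 * x2])

x = np.array([1.0, 2.0])
y = np.array([3.0, 0.5])

# Both routes give the same number: the kernel evaluates the
# inner product in H without constructing phi.
print(poly_kernel(x, y))     # -> 25.0
print(phi(x) @ phi(y))       # -> 25.0 (up to rounding)
```

This is the "kernel trick": for kernels such as the Gaussian (RBF) kernel, the space H is infinite-dimensional, so computing k(x, x′) directly is the only practical option.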
Don’t worry if your head hurts.