Support Vector Machines (I): Overview and Linear SVM
LING 572 Advanced Statistical Techniques for NLP February 13 2020
Why another learning method?
- Based on some beautifully simple ideas (Schölkopf, 1998)
- Maximum margin
The classifier is a decision function f: X ➔ R; the sign of f(x) gives the predicted class label (+1 or -1).
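As a minimal sketch of such a decision function (the weight vector and bias below are invented for illustration, not taken from the slides):

```python
import numpy as np

def predict(w, b, x):
    """Linear decision function: the sign of <w, x> + b."""
    return 1 if np.dot(w, x) + b >= 0 else -1

# Hypothetical weights and bias, for illustration only.
w = np.array([2.0, -1.0])
b = -0.5

print(predict(w, b, np.array([1.0, 0.0])))  # <w,x>+b = 1.5  -> +1
print(predict(w, b, np.array([0.0, 2.0])))  # <w,x>+b = -2.5 -> -1
```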
Many different hyperplanes separate the two classes. Which one should we choose?
Choose the hyperplane that maximizes the margin: the distance between the hyperplane and the closest examples.
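The distance from an example to a hyperplane can be sketched as follows (the hyperplane and points below are invented for illustration):

```python
import numpy as np

def distance_to_hyperplane(w, b, X):
    """Geometric distance |<w, x> + b| / ||w|| for each row x of X."""
    return np.abs(X @ w + b) / np.linalg.norm(w)

# Hypothetical hyperplane <w,x> + b = 0 and training points.
w, b = np.array([1.0, 1.0]), -1.0
X = np.array([[0.0, 0.0], [2.0, 2.0], [1.0, 0.5]])

d = distance_to_hyperplane(w, b, X)
closest = d.min()  # the margin is determined by the closest example(s)
print(closest)
```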
When the data are not perfectly separable, allow some (penalized) errors.

In practice, a linear SVM can be trained with scikit-learn's SVC (see sklearn.svm.SVC.html#sklearn.svm.SVC).
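A usage sketch of scikit-learn's SVC with a linear kernel (the toy data below is invented for illustration; C is the soft-margin penalty on errors):

```python
import numpy as np
from sklearn.svm import SVC

# Toy linearly separable data, invented for illustration.
X = np.array([[0, 0], [1, 1], [3, 3], [4, 4]])
y = np.array([-1, -1, 1, 1])

# kernel='linear' gives a linear SVM; C controls the soft-margin penalty.
clf = SVC(kernel='linear', C=1.0)
clf.fit(X, y)

print(clf.predict([[0.5, 0.5], [3.5, 3.5]]))  # expect [-1, 1]
print(clf.support_vectors_)                   # the closest training examples
```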
Linear SVM training maximizes the margin, i.e.,

minimize ||w||^2 / 2
subject to yi(<w,xi> + b) >= 1 for all i
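A sketch of why minimizing ||w||^2/2 maximizes the margin (a standard derivation, not reproduced verbatim from the slides):

```latex
% Geometric distance from x_i to the hyperplane <w, x> + b = 0:
d(x_i) = \frac{|\langle w, x_i \rangle + b|}{\lVert w \rVert}

% Under the canonical scaling y_i(\langle w, x_i \rangle + b) \ge 1,
% the closest examples satisfy y_i(\langle w, x_i \rangle + b) = 1,
% so they lie at distance 1/\lVert w \rVert on each side:
\text{margin} = \frac{2}{\lVert w \rVert}

% Hence maximizing the margin is equivalent to minimizing
% \tfrac{1}{2}\lVert w \rVert^2 subject to the constraints.
```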
Compare with kNN's decision rule:

c* = arg max_c g(c)
c* = arg max_c Σ_i w_i δ(c, f_i(x))

e.g., w_i = 1/dist(x, x_i) ➔ we can use all the training examples.
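The weighted-vote rule above can be sketched as follows (the toy data and the choice of Euclidean distance are assumptions for illustration):

```python
import numpy as np
from collections import defaultdict

def weighted_vote_predict(X_train, y_train, x):
    """Weighted vote over ALL training examples, with w_i = 1/dist(x, x_i)."""
    votes = defaultdict(float)
    for xi, ci in zip(X_train, y_train):
        d = np.linalg.norm(x - xi)
        # An exact match gets infinite weight; otherwise weight by 1/distance.
        votes[ci] += 1.0 / d if d > 0 else float('inf')
    # c* = arg max_c sum_i w_i delta(c, c_i)
    return max(votes, key=votes.get)

# Toy data, invented for illustration.
X_train = np.array([[0.0, 0.0], [0.0, 1.0], [5.0, 5.0]])
y_train = [-1, -1, 1]
print(weighted_vote_predict(X_train, y_train, np.array([0.5, 0.5])))  # -> -1
```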
The decision hyperplane is <w,x> + b = 0.
What if the data are not linearly separable?
- Allow some (penalized) errors ➔ soft margin
- Map the data to a higher-dimension space, where they may become linearly separable
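To illustrate the second idea, here is a sketch (with invented 1-D data) of a feature map φ(x) = (x, x²): the data are not separable by any threshold on x, but become linearly separable in the 2-D space:

```python
import numpy as np

def phi(x):
    """Map a 1-D input to 2-D: x -> (x, x^2)."""
    return np.array([x, x * x])

# Invented 1-D data: inner points are class -1, outer points are class +1.
# No single threshold on x separates them.
X = [-3.0, -2.0, -0.5, 0.5, 2.0, 3.0]
y = [1, 1, -1, -1, 1, 1]

# After phi, a line in the new space (here: second coordinate >= 2)
# separates the two classes, for this toy data.
preds = [1 if phi(x)[1] >= 2.0 else -1 for x in X]
print(preds == y)  # True
```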