Natural Language Processing (CSE 517): Text Classification (II)
Noah Smith
© 2016 University of Washington · nasmith@cs.washington.edu
February 1, 2016
Quick Review: Text Classification

◮ Input: a piece of text x ∈ V†, usually a document.
[Figure: loss plotted as a function of the classifier score.]
◮ Pick î uniformly at random from {1, . . . , n}.
◮ w ← w − α ∇w loss(xî, yî; w)
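The stochastic subgradient step above can be sketched as follows. This is a minimal illustration, not the lecture's exact algorithm: the hinge loss, the dataset, and all names (`sgd_hinge`, `alpha`, `steps`) are assumptions.

```python
import random

def sgd_hinge(examples, n_features, alpha=0.1, steps=1000, seed=0):
    """Stochastic subgradient descent on the hinge loss max(0, 1 - y * (w . x)).

    `examples` is a list of (x, y) pairs with x a list of feature values and
    y in {-1, +1}. Illustrative sketch; details are not from the slides.
    """
    rng = random.Random(seed)
    w = [0.0] * n_features
    for _ in range(steps):
        # Pick one training example uniformly at random from {1, ..., n}.
        x, y = rng.choice(examples)
        score = sum(wj * xj for wj, xj in zip(w, x))
        # The hinge-loss subgradient is -y * x when the margin is violated,
        # and zero otherwise, so we only update on violated examples.
        if y * score < 1.0:
            for j in range(n_features):
                w[j] += alpha * y * x[j]
    return w

# Tiny linearly separable dataset: positive when feature 0 exceeds feature 1.
data = [([2.0, 0.0], 1), ([0.0, 2.0], -1), ([3.0, 1.0], 1), ([1.0, 3.0], -1)]
w = sgd_hinge(data, n_features=2)
```

On separable data like this, enough random steps drive every training margin positive.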
[Figure: the hinge loss as a function of the score x, plotted as −x + max(x, 1), i.e., max(0, 1 − x).]
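The plotted expression −x + max(x, 1) is just the standard hinge loss max(0, 1 − x) written in a different form; a quick check (function names are illustrative):

```python
def hinge_plotted(x):
    """The form from the plot: -x + max(x, 1)."""
    return -x + max(x, 1.0)

def hinge_standard(x):
    """The usual hinge-loss form: max(0, 1 - x)."""
    return max(0.0, 1.0 - x)
```

For x ≥ 1 both are 0; for x < 1 both equal 1 − x.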
◮ Linear kernels are most common in NLP.
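One reason linear kernels are attractive: a kernelized score with a linear kernel collapses back into a single dot product. A small sketch (the support set, weights, and names are illustrative assumptions):

```python
def linear_kernel(x, z):
    """Linear kernel: K(x, z) = x . z."""
    return sum(xi * zi for xi, zi in zip(x, z))

def kernel_score(x, support, alphas):
    """Kernelized score: sum_i alpha_i * y_i * K(x_i, x)."""
    return sum(a * y * linear_kernel(xi, x)
               for a, (xi, y) in zip(alphas, support))

# Toy support examples (x_i, y_i) with illustrative dual weights alpha_i.
support = [([1.0, 2.0], 1), ([2.0, 0.0], -1)]
alphas = [0.5, 0.25]

# With a linear kernel the same score is a plain dot product with the
# collapsed weight vector w = sum_i alpha_i * y_i * x_i.
w = [sum(a * y * xi[j] for a, (xi, y) in zip(alphas, support))
     for j in range(2)]
```

This collapse is why linear-kernel classifiers stay cheap at test time, regardless of the number of support examples.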
◮ Lexicon features can provide problem-specific guidance.
◮ You should have a basic understanding of the tradeoffs in choosing among text classification methods.
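The lexicon-feature idea above can be sketched as a small feature extractor. The word lists here are illustrative assumptions, not from the lecture:

```python
# Tiny hand-built sentiment lexicons (illustrative, not from the slides).
POSITIVE = {"good", "great", "excellent", "love"}
NEGATIVE = {"bad", "terrible", "awful", "hate"}

def lexicon_features(tokens):
    """Count-based lexicon features to add alongside bag-of-words features."""
    return {
        "count_positive": sum(1 for t in tokens if t in POSITIVE),
        "count_negative": sum(1 for t in tokens if t in NEGATIVE),
    }

feats = lexicon_features("the movie was great but the ending was terrible".split())
```

Two dense counts like these can inject problem-specific knowledge that sparse word features alone would need much more data to learn.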