Using Sentence-Level LSTM Language Models for Script Inference
Karl Pichotta and Raymond J. Mooney, The University of Texas at Austin
ACL 2016, Berlin
Event Inference: Motivation

Suppose we want to build a Question Answering system…
"Troops from the Commune, under General Coffinhal, arrived to free the prisoners and then marched against the Convention itself." –Wikipedia

Understanding a passage like this requires inferring implicit events.
8
9
10
11
12
Background:
- Structured accounts of scripts (events in sequence).
- Statistical models of (verb, dependency) events.
- Neural models of event sequences [e.g. P. & Mooney (AAAI 2016)].
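The (verb, dependency) event representation can be sketched as a small structure. The slot names and defaults below are illustrative assumptions, not the cited systems' exact implementation:

```python
from typing import NamedTuple, Optional

class Event(NamedTuple):
    """A script event: a verb plus its core dependency arguments.
    Slot names here are illustrative; the cited work extracts
    (verb, dependency) relations with a syntactic parser."""
    verb: str
    subject: Optional[str] = None
    direct_object: Optional[str] = None
    prep_arg: Optional[str] = None  # e.g. "from plane"

    def __str__(self) -> str:
        # Render as verb(arg1, arg2, ...), skipping empty slots.
        args = [a for a in (self.subject, self.direct_object, self.prep_arg)
                if a is not None]
        return f"{self.verb}({', '.join(args)})"

# The slides' running example:
jump = Event(verb="jumped", subject="jim", prep_arg="from plane")
```

With this encoding, `str(jump)` yields the slide notation `jumped(jim, from plane)`.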
Training pipeline: Millions of Documents -> NLP Pipeline -> Millions of Event Sequences -> Train a Statistical Model
Standard test pipeline: New Test Document -> NLP Pipeline -> Single Event Sequence -> Query Trained Statistical Model -> Inferred Probable Events
Proposed alternative, replacing components one at a time: New Test Document -> Single Text Sequence -> Query Trained Statistical Model -> Inferred Probable Text -> Parse Events from Text
Background: recurrent neural networks encode token sequences into vectors…
Sentence-level RNN language model: an RNN reads t_{i-1} and t_i (the word sequences for sentences i-1 and i) and predicts t_{i+1} (the word sequence for sentence i+1).
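The t_i -> t_{i+1} setup can be illustrated with a toy model. A bigram model stands in here for the paper's LSTM, purely to show the interface (read a context sentence, greedily emit successor tokens); the class and corpus are hypothetical:

```python
from collections import Counter, defaultdict

class ToyNextSentenceLM:
    """Toy stand-in for a sentence-level LSTM LM: a bigram model that,
    given the end of sentence t_i, greedily generates tokens of t_{i+1}.
    (The paper trains an LSTM; this only illustrates the setup.)"""

    def __init__(self):
        self.bigrams = defaultdict(Counter)

    def train(self, document):
        # Treat the document as one token stream so the model can
        # continue across a sentence boundary (t_i -> t_{i+1}).
        tokens = [tok for sentence in document for tok in sentence.split()]
        for prev, nxt in zip(tokens, tokens[1:]):
            self.bigrams[prev][nxt] += 1

    def greedy_continue(self, context, max_tokens=5):
        # Condition on the last context token, emit the most likely
        # next token, and repeat.
        prev, out = context.split()[-1], []
        for _ in range(max_tokens):
            if not self.bigrams[prev]:
                break
            prev = self.bigrams[prev].most_common(1)[0][0]
            out.append(prev)
        return " ".join(out)

lm = ToyNextSentenceLM()
lm.train(["jim jumped from a plane", "he landed on the ground"])
```

On this two-sentence corpus, `lm.greedy_continue("jim jumped from a plane")` reproduces the successor sentence token by token.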
Can sentence-level language models infer events?
Two modeling routes:
- Predict an event from a sequence of events, optionally expanding into text.
- Predict text from text, optionally parsing into events.
Predict an event from a sequence of events [≈ P. & Mooney (2016)]:
jumped(jim, from plane) -> LSTM -> landed(jim, on ground) -> LSTM -> "Jim landed on the ground."
Predict text from text [≈ Kiros et al. 2015]:
"Jim jumped from the plane and" -> LSTM -> "Jim landed on the ground." -> Parser -> landed(jim, on ground)
Models are trained with stochastic gradient descent with momentum.
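The training detail mentioned above (SGD with momentum) corresponds to the classical update rule. The learning rate and momentum values below are placeholders, not the paper's settings:

```python
def sgd_momentum_step(params, grads, velocity, lr=0.1, mu=0.9):
    """One SGD-with-momentum update, applied in place:
        v <- mu * v - lr * g
        p <- p + v
    lr and mu are placeholder hyperparameters, not the paper's.
    """
    for i, g in enumerate(grads):
        velocity[i] = mu * velocity[i] - lr * g
        params[i] += velocity[i]
    return params, velocity

# Two steps on a single parameter with a constant gradient of 1.0:
p, v = [1.0], [0.0]
sgd_momentum_step(p, [1.0], v)  # p ≈ [0.9],  v ≈ [-0.1]
sgd_momentum_step(p, [1.0], v)  # p ≈ [0.71], v ≈ [-0.19]
```

The second step moves further than the first because the velocity term accumulates the repeated gradient direction.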
Evaluation: hold out an event from a document and judge a system on inferring it.
- Accuracy: "In what percentage of held-out documents is the top inference the gold-standard answer?"
- Partial credit: "What percentage of the inferred event's components are the same as in the gold standard?"
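The two scores can be sketched as follows. This is a minimal slot-by-slot reading of the metrics, assuming the tuple event encoding from earlier, not the paper's exact scoring code:

```python
def exact_match(inferred, gold):
    """Accuracy criterion: the top inference equals the gold event."""
    return inferred == gold

def partial_credit(inferred, gold):
    """Fraction of the gold event's filled components (verb and
    arguments, compared slot by slot) that the inference reproduces."""
    filled = [(i, g) for i, g in zip(inferred, gold) if g is not None]
    if not filled:
        return 0.0
    return sum(i == g for i, g in filled) / len(filled)

# Hypothetical example: right verb and subject, wrong prepositional arg.
gold = ("landed", "jim", None, "on ground")
hypo = ("landed", "jim", None, "on floor")
```

Here `partial_credit(hypo, gold)` is 2/3: two of the three filled gold slots match.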
Results: inferring held-out events.

System          Accuracy (%)   Partial credit (%)
Most common     0.2            26.5
e1 -> e2        2.3            26.7
t1 -> t2 -> e2  2.0            30.3

Partial-credit figures are component-level precisions.
Second evaluation: given a sentence, judge how well a system predicts its successor.
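The "1-BLEU" numbers reported next are unigram BLEU scores. A minimal single-reference version (clipped unigram precision times the standard brevity penalty, after Papineni et al. 2002) can be sketched as:

```python
import math
from collections import Counter

def unigram_bleu(candidate, reference):
    """Unigram BLEU for one candidate/reference token list:
    clipped unigram precision times the brevity penalty."""
    if not candidate:
        return 0.0
    ref_counts = Counter(reference)
    # Clip each candidate token's count by its count in the reference.
    clipped = sum(min(count, ref_counts[tok])
                  for tok, count in Counter(candidate).items())
    precision = clipped / len(candidate)
    # Brevity penalty: punish candidates shorter than the reference.
    if len(candidate) >= len(reference):
        bp = 1.0
    else:
        bp = math.exp(1 - len(reference) / len(candidate))
    return bp * precision
```

An identical candidate and reference score 1.0; clipping keeps a candidate from gaming the metric by repeating one reference word.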
Results: predicting successor text.

System          BLEU   1-BLEU (unigram)
t1 -> t1        1.88   22.6
e1 -> e2 -> t2  0.34   19.9
t1 -> t2        5.2    30.9

At predicting text, text-level models outperform event models.
Sample inference (excerpts):
"…him."
"…thought the pistol had accidentally discharged and that he did not believe that Curly Bill shot him on purpose."
Sample inference (excerpts):
"…its company name to Panasonic Corporation."
"…'National' in Japan are currently marketed under the 'Panasonic' brand."
Conclusions:
- Raw text can serve in place of a structured representation as events (and doesn't require a parser!).
- Whether this carries over to other NLP tasks is an exciting open question.