Building Adaptable and Scalable Natural Language Generation Systems
Yannis Konstas
Natural Language Generation is everywhere (Machine Translation)

Input:
Ο πρόεδρος των ΗΠΑ Ντόναλντ Τραμπ γνωστοποίησε ότι δεν θα πάει στο ετήσιο δείπνο της Ένωσης Ανταποκριτών Λευκού Οίκου (WHCA) στα τέλη του Απριλίου.

Human: The president of the United States Donald Trump announced that he would not go to the annual dinner in late April.

Machine: The US president Donald Trump announced that it would go to the annual dinner of White House Correspondents Union (WHCA) in late April.
(The machine output drops the negation: Trump announced he would not attend.)
Natural Language Generation is everywhere (Dialogue Systems)

"What is the weather going to be like in Sitka?"
"What is the weather going to be like in Chicago?"
"No, I meant Chicago."
"How about on Tuesday?"
Natural Language Generation is everywhere (Conversational Agents)
…or when things get too emotional
(Harsley et al., CSCW 2016)
Natural Language Generation is everywhere
(Educational Technology)
Natural Language Generation is everywhere (Caption Generation)
(Krause et al., CVPR 2017)

Caption: A man swinging a bat.
Paragraph: A baseball player is swinging a bat. He is wearing a red helmet and a white shirt. The catcher’s mitt is behind the batter.
Concept-to-Text · Text Summarization · Machine Translation · Dialogue Systems · Conversational Agents · Code to Language · Storytelling · Captions · Instructional Text · Educational Technology · Meaning Representations · Human-Robot Interaction
Input

Meaning representation: know I planet lazy inhabit man

Table:
        min  mean  max  mode
wind    10   15    20
dir                       W
temp    50   60    72
gust     5   10    13

Source code:
public int TextWidth (string text) {
    TextBlock t = new TextBlock();
    t.Text = text;
    return (int) Math.Ceiling(t.ActualWidth);
}

Equation: 20x + 5y = γ

Text: High quality source code is often paired with high level summaries of the computation it performs, for example in code documentation or in descriptions posted in online forums.
Output

Meaning Representations: know I planet lazy inhabit man → I know the planet is inhabited by a lazy man.
Concept-to-Text: weather table → Overcast, with a high of 70. Moderate westerly winds, with gusts as high as 13 mph.
Educational Technology: 20x + 5y = γ → Tammy bought 20 apples and 5 oranges. How many fruits does she have now?
Code to Language: TextWidth snippet → Get rendered width of string rounded up to the nearest integer.
Human-Robot Interaction: Place the heineken block west
Machine Translation: "High quality source code is often paired with high level summaries of the computation it performs, for example in code documentation or in descriptions posted in online forums." → 高品質のソースコードは、コードドキュメントやオンラインフォーラムに掲載された説明など、実行する計算のハイレベルの要約と対になることがよくあります。
Successes / Challenges
Large corpora without extra annotation
Joint work with Srinivasan Iyer, Mark Yatskar, Luke Zettlemoyer, Yejin Choi
Input: Graph Structure (Abstract Meaning Representation - AMR; Banarescu et al., 2013)
(Flanigan et al., NAACL 2016; Pourdamaghani and Knight, INLG 2016; Song et al., EMNLP 2016)

(know :ARG0 (I)
      :ARG1 (planet :ARG1-of (inhabit :ARG0 (man :mod lazy))))

Possible realizations:
I knew a planet that was inhabited by a lazy man.
I have known a planet that was inhabited by a lazy man.
I know a planet. It is inhabited by a lazy man.

(Konstas, Iyer, Yatskar, Choi, Zettlemoyer, ACL 2017, to appear)
Encoder → Decoder → Attention

input: know ARG0 I ARG1 ( planet ARG1-of inhabit …
partial output: <s> I know the planet
next-word candidates: inhabit, inhabited, was, …

The decoder searches for the most probable output sequence:
\hat{w} = \arg\max_{w} \prod_i p(w_i \mid w_{<i}, x)
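A minimal sketch of how that factorized probability is evaluated for a candidate sequence; `step_fn` is a stand-in for the decoder's next-word distribution, not the actual model:

```python
import math

def sequence_logprob(step_fn, words):
    """log p(w | x) = sum_i log p(w_i | w_<i, x).

    `step_fn(prefix)` must return a dict mapping each candidate next
    word to its probability given the prefix -- an illustrative
    stand-in for one step of the attention decoder.
    """
    return sum(math.log(step_fn(words[:i])[w]) for i, w in enumerate(words))

# Toy decoder that spreads probability uniformly over three words:
toy = lambda prefix: {"I": 1/3, "know": 1/3, "planet": 1/3}
print(sequence_logprob(toy, ["I", "know", "planet"]))  # 3 * log(1/3)
```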
Graph → Depth-First Search

A depth-first traversal of the graph yields the linearized sequence:

hold :ARG0 (person :ARG0-of (have-role :ARG1 United_States :ARG2 official)) :ARG1 (meet :ARG0 (person :ARG1-of expert :ARG2-of group)) :time (date-entity 2002 1) :location New_York

US officials held an expert group meeting in January 2002 in New York .
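A minimal sketch of that depth-first linearization, assuming the graph is given as nested (concept, edges) pairs; the data structure and helper are illustrative, not the paper's implementation:

```python
def linearize(node):
    """Depth-first linearization of an AMR-like graph into a token list.

    `node` is a (concept, edges) pair, where edges is a list of
    (relation, child_node) pairs -- an illustrative encoding, not the
    format used in the original paper.
    """
    concept, edges = node
    tokens = [concept]
    for relation, child in edges:
        tokens.append(":" + relation)
        _, child_edges = child
        if child_edges:                 # complex subgraph: parenthesize
            tokens.append("(")
            tokens.extend(linearize(child))
            tokens.append(")")
        else:                           # leaf concept
            tokens.extend(linearize(child))
    return tokens

meet = ("meet", [("ARG0", ("person", [("ARG1-of", ("expert", [])),
                                      ("ARG2-of", ("group", []))]))])
hold = ("hold", [("ARG1", meet),
                 ("time", ("date-entity", [])),
                 ("location", ("New_York", []))])
print(" ".join(linearize(hold)))
# hold :ARG1 ( meet :ARG0 ( person :ARG1-of expert :ARG2-of group ) ) :time date-entity :location New_York
```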
Linearize → RNN encoding

hold :ARG0 (person :ARG0-of (have-role :ARG1 United_States :ARG2 official)) :ARG1 (meet :ARG0 (person :ARG1-of expert :ARG2-of group)) :time (date-entity 2002 1) :location New_York

Each token of the linearized sequence (hold, :ARG0, (, person, :ARG0-of, …) is embedded and fed to a recurrent encoder, producing one hidden state per input position: h_1^(s), h_2^(s), …, h_N^(s).
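A minimal sketch of such an encoder using PyTorch; the vocabulary, dimensions, and layer choice are illustrative assumptions, not the paper's configuration:

```python
import torch
import torch.nn as nn

# Map each linearized token to an integer id (toy vocabulary).
vocab = {tok: i for i, tok in enumerate(
    ["hold", ":ARG0", "(", "person", ":ARG0-of", ")"])}
embed = nn.Embedding(len(vocab), 64)      # token id -> 64-dim vector
encoder = nn.LSTM(input_size=64, hidden_size=128, batch_first=True)

tokens = ["hold", ":ARG0", "(", "person", ":ARG0-of"]
ids = torch.tensor([[vocab[t] for t in tokens]])   # shape (1, 5)
states, _ = encoder(embed(ids))                    # h_1..h_5, shape (1, 5, 128)
```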
RNN Encoding → RNN Decoding (Beam search)

Starting from the encoder states h_1 … h_N^(s) and the empty prefix ∅, the decoder proposes candidate next words at every step and keeps only the k best-scoring prefixes:

step 1 (h_1): candidates Holding, Held, US, … → beam: w_1,1: Holding; w_1,2: Helds; w_1,3: Hold; w_1,4: US
step 2 (h_2): candidates a, the, meeting, … → beam: w_2,1: Hold a; w_2,2: Hold the; w_2,3: Held a; w_2,4: Held the
step 3 (h_3): candidates US, person, expert, …
…
step k (h_k): candidates meeting, meetings, meet, … → beam: w_k,1: The US …; w_k,2: US …; w_k,3: US …; w_k,4: US …

The highest-scoring complete hypothesis is returned.
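A minimal sketch of the beam search; `step_fn` stands in for one decoder step (log-probabilities over the vocabulary given the current prefix) and is an assumption of this illustration:

```python
def beam_search(step_fn, vocab, beam_size=4, max_len=20, eos="</s>"):
    """Keep the `beam_size` highest-scoring prefixes at each step.

    `step_fn(prefix)` must return one log-probability per word in
    `vocab` for the next position -- a stand-in for the attention
    decoder, not the real model.
    """
    beam = [([], 0.0)]  # (prefix, cumulative log-probability)
    for _ in range(max_len):
        candidates = []
        for prefix, score in beam:
            if prefix and prefix[-1] == eos:   # finished hypothesis
                candidates.append((prefix, score))
                continue
            for word, logp in zip(vocab, step_fn(prefix)):
                candidates.append((prefix + [word], score + logp))
        # prune: keep only the beam_size best prefixes
        beam = sorted(candidates, key=lambda c: c[1], reverse=True)[:beam_size]
        if all(p and p[-1] == eos for p, _ in beam):
            break
    return beam[0][0]  # highest-scoring hypothesis
```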
Attention

At each decoder step i (here i = 3, previous word w_2 = held, candidates a, the, meeting, …), a context vector c_i summarizes the encoder states h_1^(s) … h_5^(s) of the input tokens (hold, :ARG0, (, person, :ARG0-of, …):

c_i = \sum_j a_{ij} h_j^{(s)}, \qquad a_i = \mathrm{softmax}(\cdot)

The attention weights align output words to input tokens:

input:  hold ARG0 ( person role US meet expert group )
output: US held an expert group meeting in January 2002
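A minimal numpy sketch of this attention step, assuming dot-product scoring inside the softmax (the slide leaves the scoring function unspecified):

```python
import numpy as np

def attention(h_dec, h_src):
    """Weight each encoder state by its similarity to the current
    decoder state, then average into a context vector.

    h_dec: decoder state, shape (d,)
    h_src: encoder states, shape (N, d)
    Dot-product scoring is an assumption of this sketch.
    """
    scores = h_src @ h_dec                 # (N,) alignment scores
    a = np.exp(scores - scores.max())
    a = a / a.sum()                        # softmax -> weights a_ij
    c = a @ h_src                          # context vector c_i
    return c, a

rng = np.random.default_rng(0)
c3, a3 = attention(rng.normal(size=128), rng.normal(size=(5, 128)))
```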
Linearization → Anonymization

Named entities, dates, and numbers are replaced by typed placeholders (and restored in the final output):

country "United States" → loc_0;  2002 → year_0;  1 → month_0;  city "New York" → loc_1

hold :ARG0 (person :ARG0-of (have-role :ARG1 loc_0 :ARG2 official)) :ARG1 (meet :ARG0 (person :ARG1-of expert :ARG2-of group)) :time (date-entity year_0 month_0) :location loc_1

loc_0 officials held an expert group meeting in month_0 year_0 in loc_1 .
US officials held an expert group meeting in January 2002 in New York .
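A minimal sketch of the anonymize/de-anonymize bookkeeping; the type-detection rules here are deliberately crude illustrations, not the paper's preprocessing:

```python
import re

def anonymize(tokens):
    """Replace concrete values with typed placeholders (loc_0, year_0, ...)
    and record the mapping so generated text can be de-anonymized."""
    mapping, counters, out = {}, {}, []
    for tok in tokens:
        if re.fullmatch(r"(19|20)\d\d", tok):
            kind = "year"
        elif tok[0].isupper() or "_" in tok:   # crude named-entity test
            kind = "loc"
        else:
            out.append(tok)
            continue
        if tok not in mapping:
            n = counters.get(kind, 0)
            mapping[tok] = f"{kind}_{n}"
            counters[kind] = n + 1
        out.append(mapping[tok])
    return out, {v: k for k, v in mapping.items()}

def deanonymize(tokens, inverse):
    """Swap placeholders back for their original values."""
    return [inverse.get(t, t) for t in tokens]

anon, inv = anonymize(["hold", ":ARG1", "United_States", ":time", "2002"])
print(anon)                    # ['hold', ':ARG1', 'loc_0', ':time', 'year_0']
print(deanonymize(anon, inv))  # original tokens restored
```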
Evaluation

Dataset: AMR LDC2015E86 (SemEval-2016 Task 8); train and evaluation splits.
Metric: BLEU (Papineni et al., ACL 2002)

System                                        BLEU
TreeToStr (Flanigan et al., NAACL 2016)       23
TSP (Song et al., EMNLP 2016)                 22.4
PBMT (Pourdamaghani and Knight, INLG 2016)    26.9
NNLG (this work)                              22
All systems use a Language Model trained on a very large corpus. We will emulate this via data augmentation (Sennrich et al., ACL 2016).

Input: hold :ARG0 (person :ARG0-of (have-role :ARG1 loc_0 :ARG2 official)) :ARG1 (meet :ARG0 (person :ARG1-of expert :ARG2-of group)) :time (date-entity year_0 month_0) :location loc_1
Reference: US officials held an expert group meeting in January 2002 in New York .
Prediction: United States officials held held a meeting in January 2002 .

Why does the baseline fall short?
a) Sparsity: 44.26% of word types occur at most once in training (OOV@1), and 74.85% at most five times (OOV@5).
b) Average sentence length: 20 words.
c) Limited language-modeling capacity.
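A small sketch of how OOV@k can be computed; reading OOV@k as the fraction of word types occurring at most k times is my interpretation of the slide:

```python
from collections import Counter

def oov_at(tokens, k):
    """Fraction of word types occurring at most k times in `tokens`."""
    counts = Counter(tokens)
    return sum(1 for c in counts.values() if c <= k) / len(counts)

corpus = "us officials held an expert group meeting in new york".split()
print(oov_at(corpus, 1))   # every type occurs once here -> 1.0
```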
Original Dataset: ~16k graph-sentence pairs
Gigaword: ~183M sentences *only* (no graphs)

Sample Gigaword sentences with vocabulary overlap with the original dataset: OOV@1 and OOV@5 drop substantially as the sample grows (Original → Giga-200k → Giga-2M).
Two attention encoder-decoders with the same architecture:

Generate from MR: graph → Encoder → Attention → Decoder → text
Parse to MR: text → Encoder → Attention → Decoder → graph

Re-train: the parser turns unannotated text into (input, text) pairs for re-training the generator (Parse to Input / Generate from Input).
Self-train the parser, then train the generator (sketched in code below):

Train MR Parser P on the Original Dataset (graph, text)
Self-train Parser:
for i = 0 … N:
    S_i = sample k · 10^i sentences from Gigaword
    Parse S_i sentences with P
    Re-train MR Parser P on S_i
Train Generator G on S_N (graph, text)
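A compact sketch of that loop; `train_parser`, `sample_gigaword`, `fine_tune`, and `train_generator` are placeholders for the actual training routines (the fine-tuning step comes from the worked example that follows), so this is pseudocode in Python form rather than runnable training code:

```python
def self_train(original_pairs, gigaword, k=200_000, n_rounds=2):
    """Self-training recipe from the slides (placeholder training calls).

    original_pairs: list of (graph, sentence) gold pairs (~16k)
    gigaword:       large pool of unannotated sentences (~183M)
    """
    parser = train_parser(original_pairs)                # P on gold data
    silver = original_pairs
    for i in range(n_rounds):
        sample = sample_gigaword(gigaword, k * 10**i)    # S_i: k * 10^i sents
        silver = [(parser.parse(s), s) for s in sample]  # silver graphs
        parser = train_parser(silver)                    # re-train P on S_i
        parser = fine_tune(parser, original_pairs)       # anchor on gold data
    generator = train_generator(silver)                  # G on S_N
    return fine_tune(generator, original_pairs)
```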
Worked example:

1. Train P on the Original Dataset.
2. Sample S1 = 200k sentences from Gigaword; parse S1 with P.
3. Train P on S1 = 200k; fine-tune P on the Original Dataset.
   (Fine-tune: initialize parameters from the previous step and train on the Original Dataset.)
4. Sample S2 = 2M sentences from Gigaword; parse S2 with P; train P on S2 = 2M; fine-tune P on the Original Dataset.
5. Sample S3 = 2M sentences from Gigaword; parse S3 with P; train G on S3 = 2M; fine-tune G on the Original Dataset.
Results with data augmentation:

System      BLEU
TreeToStr   23
TSP         22.4
PBMT        26.9
NNLG        22
NNLG-200k   27.4
NNLG-2M     32.3
NNLG-20M    34.06

(TreeToStr: Flanigan et al., NAACL 2016; TSP: Song et al., EMNLP 2016; PBMT: Pourdamaghani and Knight, INLG 2016)
Input: hold :ARG0 (person :ARG0-of (have-role :ARG1 loc_0 :ARG2 official)) :ARG1 (meet :ARG0 (person :ARG1-of expert :ARG2-of group)) :time (date-entity year_0 month_0) :location loc_1
Reference: US officials held an expert group meeting in January 2002 in New York .
Prediction: In January 2002 United States officials held a meeting of the group experts in New York .

Remaining errors: disfluency and coverage.

Reference: The report stated British government must help to stabilize weak states and push for international regulations that would stop terrorists using freely available information to create and unleash new forms of biological warfare such as a modified version of the influenza virus.
Prediction: The report stated that the Britain government must help stabilize the weak states and push international regulations to stop the use of freely available information to create a form of new biological warfare such as the modified version
From Meaning Representations of Natural Language to Programming Languages

Joint work with Srinivasan Iyer, Luke Zettlemoyer, Alvin Cheung
Input: Source Code (SQL, C#)
(Summarizing Source Code using a Neural Attention Model. Iyer, Konstas, Cheung, Zettlemoyer, ACL 2016)

public int TextWidth (string text) {
    TextBlock t = new TextBlock();
    t.Text = text;
    return (int) Math.Ceiling(t.ActualWidth);
}
Output: Get rendered width of string rounded up to the nearest integer.

SELECT max(marks) FROM stud_records WHERE marks < (SELECT max(marks) FROM stud_records);
Output: How to find the second largest value from a table?
Pipeline:

1) Code snippet → linearize (left-to-right):
   SELECT max(marks) FROM stud_records WHERE marks < (SELECT max(marks) FROM stud_records);
2) Anonymize:
   SELECT max(col0) FROM tab0 WHERE col0 < (SELECT max(col1) FROM tab1);
3) Bag-of-words encoding: each anonymized token (SELECT, max, col0, FROM, tab0, …) gets an embedding h_1^(s) … h_N^(s).
4) RNN decoding over the encoded input, with
5) attention computed directly on the input embeddings (e.g. at step w_2 = largest, next-word candidates value, the, col_0, …, with context vector c_3).

Target: How to find the second largest value from a table?
(Summarizing Source Code using a Neural Attention Model. Iyer, Konstas, Cheung, Zettlemoyer, ACL 2016)

A sketch of the anonymization step follows.
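A minimal sketch of step 2 for SQL; real code anonymization needs a proper parser, and the heuristics below (the identifier after FROM is a table, other bare identifiers are columns, names map consistently) are illustrative simplifications. Note the slide's scheme assigns fresh placeholders in the subquery (col1, tab1), which this sketch does not reproduce:

```python
import re

def anonymize_sql(query):
    """Map concrete column/table names to col0, col1, ... / tab0, tab1, ..."""
    keywords = {"SELECT", "FROM", "WHERE", "MAX", "MIN", "AND", "OR"}
    cols, tabs = {}, {}
    out, prev = [], None
    for tok in re.findall(r"\w+|\S", query):
        if tok.upper() in keywords or not tok[0].isalpha():
            out.append(tok)                 # keyword or punctuation: keep
        elif prev and prev.upper() == "FROM":
            out.append(tabs.setdefault(tok, f"tab{len(tabs)}"))
        else:
            out.append(cols.setdefault(tok, f"col{len(cols)}"))
        prev = tok
    return " ".join(out)

print(anonymize_sql(
    "SELECT max(marks) FROM stud_records WHERE marks < "
    "(SELECT max(marks) FROM stud_records);"))
# SELECT max ( col0 ) FROM tab0 WHERE col0 < ( SELECT max ( col0 ) FROM tab0 ) ;
```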
Evaluation on SQL and C# against IR (retrieval baseline), PBMT (MOSES phrase-based MT system), and SUM-NN (Rush et al., EMNLP 2015):

[Bar charts: BLEU, plus human ratings of Naturalness and Informativeness on a 1-5 scale, for IR, PBMT, SUM-NN, and CODE-NN on both SQL and C#.]
SELECT * FROM table ORDER BY Rand() LIMIT 10
Reference: Select random rows from mysql table
CODE-NN: How to get random rows from a mysql database?

foreach (string pTxt in xml.parent) {
    TreeNode parent = new TreeNode();
    foreach (string cTxt in xml.child) {
        TreeNode child = new TreeNode();
        parent.Nodes.Add(child);
    }
}
Reference: Adding childs to a treenode dynamically in C#
CODE-NN: How to get all child nodes in TreeView?
(Koncel-Kedziorski, Konstas, Zettlemoyer, Hajishirzi. A Theme-Rewriting Approach for Generating Algebra Word Problems, EMNLP 2016)
Joint work with Rik Koncel-Kedziorski, Luke Zettlemoyer, Hannaneh Hajishirzi

Original problem: Bob has 639 sheep. Alice has 504 sheep. How many more sheep does Bob have than Alice?
Underlying equation: 504 + x = 639

Theme (Luke Skywalker, blasters) + syntactic, semantic, thematic rewriter →

Rewritten problem: Luke Skywalker has 639 blasters. Leia has 504 blasters. How many more blasters does Luke Skywalker have than Leia?

The rewrite preserves the equation (504 + x = 639) and is scored by f(LM_IN, LM_OUT, LM_G), a combination of three language models over the corpora s_OUT, s_IN, and s_G below.
s_OUT (theme text) → LM_IN:
Luke Skywalker uses the force to open the locked door that leads to the hangar. Then Han Solo runs past the spaceship in the hangar and blasted the two droids guarding it.

s_IN (word-problem text) → LM_OUT:
Bob has 639 sheep. Alice has 504 sheep. How many more sheep does Bob have than Alice?

s_G (general text) → LM_G:
Defense lawyer Thomas Olsson stated it was very tragic and a failure for Swedish law and […] representing had been kept in detention. The official alleged Karzai was reluctant to move against big drug lords in Karzai's political power base.

Selected rewrite, scored by f(LM_IN, LM_OUT, LM_G):
Luke Skywalker has 639 blasters. Leia has 504 blasters. How many more blasters does Luke Skywalker have than Leia?
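One simple reading of f(LM_IN, LM_OUT, LM_G) is a weighted combination of the three models' scores for a candidate rewrite; the weights, the toy unigram models, and the combination form below are illustrative assumptions, not the paper's exact objective:

```python
import math
from collections import Counter

def make_unigram_lm(corpus_tokens, vocab_size=10_000):
    """Toy add-one-smoothed unigram LM returning log P(sentence)."""
    counts = Counter(corpus_tokens)
    total = sum(counts.values())
    def logp(sentence):
        return sum(math.log((counts[w] + 1) / (total + vocab_size))
                   for w in sentence.split())
    return logp

def f(sentence, lm_in, lm_out, lm_g, weights=(0.4, 0.4, 0.2)):
    """Weighted combination of the three LM scores for a candidate rewrite."""
    return sum(w * lm(sentence)
               for w, lm in zip(weights, (lm_in, lm_out, lm_g)))

lm_in = make_unigram_lm("luke skywalker has blasters".split())
lm_out = make_unigram_lm("bob has 639 sheep alice has 504 sheep".split())
lm_g = make_unigram_lm("the official stated it was tragic".split())
print(f("luke skywalker has 639 blasters", lm_in, lm_out, lm_g))
```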
(A Global Model for Concept-to-Text Generation. Konstas and Lapata, JAIR 2013; EMNLP 2013)
(Angeli et al., EMNLP 2010; Kim and Mooney, COLING 2010)

Input: database of weather records

record  time  min  mean  max   mode
wind    12-3  3    5     7
wind    3-6   5    5     5
wind    6-9   5    6     7
dir     12-3                   NW
dir     3-6                    NE
dir     6-9                    NE
temp    12-9  40   42    45
precip  12-3  25   223   45    50
precip  3-6   15   30    50
precip  6-9   12   18    25
cover   12-3                   50-75
cover   3-6                    50-75
cover   6-9                    75-100

Output: Chance of rain then becoming overcast, with a high of 45. Calm to moderate northeast winds.
[Derivation diagram: the grammar segments the output into record-aligned spans, e.g. "chance of rain" from precip1, "then becoming" (overcast) from cover1, "with a high" (of 45) from temp1 (max), "calm to moderate" from the wind records (min, mean), "northeast winds." from dir (mode); sentence spans S1 = [0,13] and S2 = [13,18].]

Chance of rain then becoming overcast, with a high of 45. Calm to moderate northeast winds.
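The paper's model learns a grammar over records rather than templates; as a toy illustration of the record-to-span alignment idea only (the record keys and template strings below are my own):

```python
# Toy record store keyed by (type, time); values are field dicts.
records = {
    ("temp", "12-9"): {"min": 40, "mean": 42, "max": 45},
    ("wind", "12-3"): {"min": 3, "mean": 5, "max": 7},
    ("dir", "12-3"):  {"mode": "NW"},
}

# Hand-written realizations aligned to output spans, one per record type.
def realize(kind, rec):
    if kind == "temp":
        return f"with a high of {rec['max']}"
    if kind == "wind":
        return f"winds of {rec['min']} to {rec['max']} mph"
    if kind == "dir":
        return f"{rec['mode']} winds"
    return ""

print(realize("temp", records[("temp", "12-9")]))  # with a high of 45
```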
Ongoing: insert a Document Planner between encoder and decoder (Mei et al., NAACL 2016):

table → Encoder → Document Planner → Document Decoder (with Attention) → document
(Krause et al., CVPR 2017; Yatskar et al., CVPR 2016; Krishna et al., 2016)

Input: semantic frames, e.g.
hitting: agent = ballplayer, victim = baseball, victim part = (none), tool = bat, place = baseball diamond
wearing: wearer = ballplayer, clothing = red helmet, body part = head
wearing: wearer = ballplayer, clothing = white shirt, body part = torso

Output: A baseball player is swinging a bat. He is wearing a red helmet and a white shirt. The catcher’s mitt is behind the batter.

frames → Encoder → Document Planner → Document Decoder (with Attention) → document
Semantic-based Machine Translation (Jones et al., COLING 2012)
Joint work with Michael Wayne Goodman

Source: The children told that lie
Target: その うそ は 子供 たち が つい た
sono uso-wa kodomo-tachi-ga tsui-ta
that lie-TOP child-and others-NOM breathe out-PAST

Graph-to-graph transformation: map the English semantic graph over {tell, child, lie, that} (edges ARG0, ARG1, ARG0-of) to the Japanese graph over {tsuku, kodomo tachi, sono} (edges ARG1, ARG0, ARG0-of).

1) Parse to MRS from English
2) Generate Japanese from MRS
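A toy sketch of the graph-to-graph step; the real system operates on MRS structures with transfer rules, not a flat lexicon, "uso" for "lie" is taken from the gloss above, and the edge attachments are read off the slide's labels only approximately:

```python
# Toy transfer lexicon (illustrative; real MRS transfer uses structured rules).
lexicon = {"tell": "tsuku", "child": "kodomo tachi",
           "lie": "uso", "that": "sono"}

def transfer(edges):
    """Map each (head, relation, dependent) triple into the target
    language, keeping the graph structure fixed."""
    return [(lexicon.get(h, h), rel, lexicon.get(d, d)) for h, rel, d in edges]

english = [("tell", "ARG0", "child"),
           ("tell", "ARG1", "lie"),
           ("lie", "ARG0-of", "that")]
print(transfer(english))
# [('tsuku', 'ARG0', 'kodomo tachi'), ('tsuku', 'ARG1', 'uso'), ('uso', 'ARG0-of', 'sono')]
```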
(Acharya et al., INLG 2016; Rieser et al., IEEE/ACM 2014)

Patient: > I would like to follow up on my speech therapy treatment.
Query graph nodes: follow, treatment, therapy, speech, I

Patient #3245 Log:
You were admitted for acute subcortical cerebrovascular accident. […] Verbal impairment related to communication impairment was treated with speech therapy 3 months ago. […]
Log graph nodes: treatment, therapy; impair, verbal, communicate

Response plan nodes: see, log, start, improve, therapy, 3

System: < I can see in my logs, that we started improving verbal impairment due to the accident, with speech therapy 3 months ago. When would you like to book the next appointment?
Thank You