

SLIDE 1

Incremental Semantic Role Labeling with Tree Adjoining Grammar

Ioannis Konstas

Joint work with Frank Keller, Vera Demberg and Mirella Lapata

Institute for Language, Cognition and Computation University of Edinburgh

2 October 2014

Ioannis Konstas (ILCC) ıSRL with PLTAG 2 October 2014 1 / 21


SLIDE 3

Introduction

Human Language Processing

Human language processing is incremental: we update our parse of the input for each new word that comes in. Incrementality leads to local ambiguity, which we can observe in garden path sentences:

(1) a. The old man the boat.
    b. I convinced her children are noisy.


SLIDE 6

Introduction

Human Language Processing

Many garden paths are not due to syntactic ambiguity alone; they also involve semantic role ambiguity:

(2) The athlete realised her goals . . .
    a. . . . at the competition.
    b. . . . were out of reach.

This indicates that humans incrementally assign semantic roles. Let's look at this example in more detail.


SLIDE 8

Introduction

Human Language Processing - Example

The athlete realised

(role assigned: A0 on "athlete"; predicted for "realised": A1, A2, ...)

A0,athlete,realised   [A1,A2],nil,realised

SLIDE 9

Introduction

Human Language Processing - Example

The athlete realised her goals

(roles assigned: A0 on "athlete", A1 on "goals")

A0,athlete,realised   A1,goals,realised

SLIDE 10

Introduction

Human Language Processing - Example

The athlete realised her goals were out of reach

(roles reanalysed: A0 on "athlete", A1 on the embedded clause, A0 on "goals")

A0,athlete,realised   A1,were,realised   A0,goals,were
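The word-by-word revision of role triples above can be sketched as follows. This is a toy illustration of the reanalysis in example (2), not the authors' system; the prefix-to-triples mapping is hand-coded.

```python
# Toy sketch: semantic role triples (role, argument, predicate) are
# revised word by word, mirroring the reanalysis in example (2).

def triples_for_prefix(words):
    """Return (role, argument, predicate) triples for a sentence prefix."""
    triples = []
    if "realised" in words:
        triples.append(("A0", "athlete", "realised"))
        if "were" in words:
            # reanalysis: "her goals" becomes the subject of "were",
            # and the embedded clause fills A1 of "realised"
            triples.append(("A1", "were", "realised"))
            triples.append(("A0", "goals", "were"))
        elif "goals" in words:
            triples.append(("A1", "goals", "realised"))
        else:
            # incomplete role: the argument has not been seen yet
            triples.append((("A1", "A2"), None, "realised"))
    return triples

sentence = "The athlete realised her goals were out of reach".split()
for i in range(3, len(sentence) + 1):
    print(sentence[:i], triples_for_prefix(sentence[:i]))
```

Note how the triple for "goals" flips from A1-of-"realised" to A0-of-"were" once the disambiguating word arrives.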


SLIDE 15

Introduction

Incremental Semantic Role Labeling

Determine semantic role labels as the input unfolds. Given a sentence prefix and its partial syntactic structure:

1. Identify arguments and predicates
2. Assign correct role labels
   - Assign incomplete semantic roles

SLIDE 16

Introduction

Non-incremental SRL

Pipeline approach: Liu and Sarkar (2007); Màrquez et al. (2008); Björkelund et al. (2009) (MATE)

- Bilexical features + syntactic features + dependency path features + reranker: Màrquez et al. (2008), Björkelund et al. (2009)
- Bilexical features + syntactic features + dependency path features + TAG features: Liu and Sarkar (2007)

SLIDE 17

ıSRL Model

Model

Psycholinguistically Motivated TAG (PLTAG) + semantic role lexicon
Incremental Role Propagation Algorithm (IRPA)
Identifier / role label disambiguation


SLIDE 23

ıSRL Model

Psycholinguistically Motivated TAG (PLTAG)

Psycholinguistically Motivated TAG (PLTAG) is a variant of tree-adjoining grammar (Demberg et al., 2014):

- in standard TAG, the lexicon consists of initial trees and auxiliary trees (both are lexicalized);
- PLTAG adds unlexicalized predictive trees to achieve connectivity;
- the standard TAG operations are substitution and adjunction;
- PLTAG adds verification to verify predictive trees;
- PLTAG supports parsing with incremental, fully connected structures.

SLIDE 24

ıSRL Model

PLTAG

Lexicon:
- standard TAG lexicon
- predictive lexicon (PLTAG)

Operations:
- substitution
- adjunction
- verification (PLTAG)

SLIDE 25

ıSRL Model

PLTAG


Example

Initial trees: [NP Peter] and [S NP↓ [VP sleeps]]
Auxiliary tree: [VP [AP often] VP*]

SLIDE 26

ıSRL Model

PLTAG


Example

[NP Peter] substitutes into [S NP↓ [VP sleeps]], resulting in [S [NP Peter] [VP sleeps]]

SLIDE 27

ıSRL Model

PLTAG


Example

[VP [AP often] VP*] adjoins to [S [NP Peter] [VP sleeps]], resulting in [S [NP Peter] [VP [AP often] [VP sleeps]]]

SLIDE 28

ıSRL Model

PLTAG


Example

Prediction tree: [S_k NP_k↓ VP_k]
The index k marks predicted nodes.

SLIDE 29

ıSRL Model

PLTAG


Example

[S_1 [NP_1 Peter] [VP_1 [AP often] VP_1]] is verified by [S NP↓ [VP sleeps]], resulting in [S [NP Peter] [VP [AP often] [VP sleeps]]]
All nodes indexed with k have to be verified.
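The substitution and adjunction operations in the examples above can be sketched on toy bracketed trees. This is an illustrative data structure of my own, not the PLTAG parser's implementation; it covers only plain TAG substitution and adjunction, not prediction or verification.

```python
# Illustrative sketch (hypothetical data structures, not the PLTAG parser):
# TAG substitution and adjunction on bracketed trees.

class Tree:
    def __init__(self, label, children=None, subst=False, foot=False):
        self.label = label               # node label, e.g. "NP"
        self.children = children or []   # leaves have no children
        self.subst = subst               # substitution slot, written NP↓
        self.foot = foot                 # foot node of an auxiliary tree, VP*

    def __repr__(self):
        if not self.children:
            mark = "↓" if self.subst else ("*" if self.foot else "")
            return self.label + mark
        return "[%s %s]" % (self.label, " ".join(map(repr, self.children)))

def substitute(tree, initial):
    """Replace the first matching substitution slot with an initial tree."""
    for i, c in enumerate(tree.children):
        if c.subst and c.label == initial.label:
            tree.children[i] = initial
            return True
        if substitute(c, initial):
            return True
    return False

def adjoin(tree, aux):
    """Splice an auxiliary tree in at the first node matching its root."""
    for i, c in enumerate(tree.children):
        if c.label == aux.label and c.children:
            # the displaced subtree takes the place of the foot node
            for j, a in enumerate(aux.children):
                if a.foot:
                    aux.children[j] = c
            tree.children[i] = aux
            return True
        if adjoin(c, aux):
            return True
    return False

# [NP Peter] substitutes into [S NP↓ [VP sleeps]]
s = Tree("S", [Tree("NP", subst=True), Tree("VP", [Tree("sleeps")])])
substitute(s, Tree("NP", [Tree("Peter")]))
# [VP [AP often] VP*] adjoins at the VP node
adjoin(s, Tree("VP", [Tree("AP", [Tree("often")]), Tree("VP", foot=True)]))
print(s)  # [S [NP Peter] [VP [AP often] [VP sleeps]]]
```

The printed result matches the derived tree on the slide.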

SLIDE 30

ıSRL Model

Comparison with TAG

TAG derivations are not always incremental.

Example

Start: [S NP↓ [VP sleeps]]
subst [NP Peter]: [S [NP Peter] [VP sleeps]]
adj [VP [AP often] VP*]: [S [NP Peter] [VP [AP often] [VP sleeps]]]

The verb tree is present before "Peter" is attached, and "often" is integrated after "sleeps", so the derivation does not follow the left-to-right order of the input.

SLIDE 31

ıSRL Model

Comparison with TAG

PLTAG derivations are always incremental and fully connected.

Example

Start: [NP Peter]
subst into prediction tree: [S_1 [NP Peter] VP_1]
adj [VP [AP often] VP*]: [S_1 [NP Peter] [VP_1 [AP often] VP_1]]
verif with [S NP↓ [VP sleeps]]: [S [NP Peter] [VP [AP often] [VP sleeps]]]

SLIDE 32

ıSRL Model

Semantic Roles in Lexicon

Role information is used for verb predicates only, derived from PropBank (Palmer et al., 2005).

[Figure: example lexicon entries for "Banks", "refused", and "open", annotated with PropBank role sets such as {A1}, {A0,A1}, and {A0,A1,A2}; prediction-tree nodes carry indices.]


SLIDE 36

ıSRL Model

Incremental Role Propagation Algorithm

[Figure: derivation of "Banks refused ..." with role sets propagated through the operations; the NP tree for "Banks" carries {A0,A1}, the S node of "refused" carries {A1}, and the predicted NP trace carries {A0,A1,A2}.]

1. subst: NP → {A0,A1},Banks,refused ; S → A1,nil,refused
2. subst: NP → {A0,A1},Banks,refused ; S → A1,S2,refused ; NP → {A0,A1,A2},t,nil
3. adj

SLIDE 37

ıSRL Model

Incremental Role Propagation Algorithm

[Figure: the prediction tree is verified by the tree for "to open"; after verification, the NP for "Banks" carries {A0,A1} with respect to "refused" and {A1} with respect to "open" ({A0,A1}/{A1}).]

3. — (adjunction adds no new triples)
4. verif: NP → {A0,A1},Banks,refused ; S → A1,to,refused ; NP → A1,Banks,open
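The triples IRPA produces can be represented as role-set/argument/predicate records that start out ambiguous or incomplete and get refined as the derivation grows. A minimal sketch, with naming of my own rather than the paper's:

```python
# Hypothetical sketch (my naming, not the paper's code): role sets from
# lexicon nodes attach to the words that fill them, producing possibly
# ambiguous (role set, argument, predicate) triples that are refined later.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Triple:
    roles: frozenset            # candidate role labels, e.g. {A0, A1}
    argument: Optional[str]     # None while the argument is still predicted
    predicate: Optional[str]

    def is_complete(self):
        # a triple is complete once exactly one role remains and both
        # the argument and the predicate have been observed
        return len(self.roles) == 1 and bool(self.argument) and bool(self.predicate)

# derivation steps for "Banks refused to open ...", as on the slide
steps = [
    Triple(frozenset({"A0", "A1"}), "Banks", "refused"),  # 1. subst (ambiguous)
    Triple(frozenset({"A1"}), None, "refused"),           # S slot: arg unseen
    Triple(frozenset({"A1"}), "Banks", "open"),           # 4. verif (complete)
]
for t in steps:
    print(t, "complete" if t.is_complete() else "incomplete")
```

Only the verified triple for "open" is complete; the others are handed to the disambiguation classifiers described on the next slide.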



SLIDE 40

ıSRL Model

Argument Identification - Role Label Disambiguation

Argument Identification

{A0,A1},Banks,refused → bilexical features + syntactic features → L2-loss support vector classifier → keep / discard

Role Label Disambiguation

{A0,A1},Banks,refused → bilexical features + syntactic features → L2-regularised logistic regression classifier → {A0},Banks,refused
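The two-stage setup above can be sketched with off-the-shelf linear classifiers. The feature templates and training labels here are invented stand-ins, not the paper's; only the classifier types (L2-loss linear SVM for keep/discard, L2-regularised logistic regression for the label) follow the slide.

```python
# Sketch of the two-stage classifier setup. Stage 1 keeps or discards
# candidate arguments; stage 2 picks one label from the candidate role
# set, e.g. {A0,A1} -> A0. Features and data are toy stand-ins.

from sklearn.feature_extraction import DictVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def features(triple):
    # invented bilexical-style features for illustration
    roles, arg, pred = triple
    return {"arg=" + str(arg): 1, "pred=" + str(pred): 1,
            "arg+pred=%s+%s" % (arg, pred): 1,
            "n_roles=" + str(len(roles)): 1}

# toy instances standing in for CoNLL 2009-derived training data
train = [(({"A0", "A1"}, "Banks", "refused"), "A0"),
         (({"A0", "A1"}, "goals", "realised"), "A1"),
         (({"A1"}, "athlete", "realised"), "A0")]
X = [features(t) for t, _ in train]

# LinearSVC uses squared hinge (L2) loss by default
identifier = make_pipeline(DictVectorizer(), LinearSVC())
identifier.fit(X, [1, 1, 0])  # toy keep(1)/discard(0) labels

# LogisticRegression is L2-regularised by default
labeler = make_pipeline(DictVectorizer(), LogisticRegression())
labeler.fit(X, [y for _, y in train])

test = features(({"A0", "A1"}, "Banks", "refused"))
if identifier.predict([test])[0] == 1:
    print(labeler.predict([test])[0])
```

In the real system the features are the bilexical and syntactic templates from the slide, and the classifiers are trained on CoNLL 2009 instances.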


SLIDE 43

Experiments

Experiments

Train PLTAG on WSJ sections 02-21 (79.41% F1)
Train classifiers on CoNLL 2009 (identification: 92.18%, labelling: 82.37%)
Gold lexicon entries during parsing; CoNLL SRL-only task

Evaluation: full-sentence accuracy (F1), Unlabelled Prediction Score (UPS), Combined Incremental SRL Score (CISS)

System comparison: ıSRL-Oracle, ıSRL, Majority-Baseline, Malt-Baseline

SLIDE 44

Experiments

Results - Full sentence

System              F1
ıSRL-Oracle         85.29
ıSRL                78.38
Majority-Baseline   63.92
Malt-Baseline       52.5

SLIDE 45

Experiments

Results - Incremental

[Plot: Unlabelled Argument Score (UAS) F1 against prefix length (5-20 words) for iSRL-Oracle, iSRL, Majority-Baseline, and Malt-Baseline]

SLIDE 46

Experiments

Results - Incremental

[Plot: Combined Incremental SRL Score (CISS) F1 against prefix length (5-20 words) for iSRL-Oracle, iSRL, Majority-Baseline, and Malt-Baseline]

SLIDE 47

Conclusions

Conclusions

New task of incremental semantic role labeling. Our system combines:

- Psycholinguistically Motivated TAG (PLTAG)
- semantic role lexicon
- Incremental Role Propagation Algorithm (IRPA)
- argument identification and role disambiguation classifiers

It outperforms the baselines and performs well incrementally: it predicts (in)complete triples early in the sentence.

SLIDE 48

Next Steps

Fusing Syntax with Semantics

Use ıSRL labels as pivotal points and score with a model of semantics: PLTAG parser + reranker

SLIDE 49

Next Steps

Fusing Syntax with Semantics

Use ıSRL labels as pivotal points and score with model of semantics PLTAG Parser Reranker

Banks refused to open

[Figure: at each word i, a beam of candidate derivations d_1i, ..., d_5i is scored f(d_ji) × α; the top-scoring candidate ŷ_i is compared with the gold derivation prefix d*_i and the weights are updated:]

α ← α + f(d*_1) − f(d_41)
α ← α + f(d*_2) − f(d_22)
α ← α + f(d*_3) − f(d_23)
α ← α + f(d*_4) − f(d_34)
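The per-word update α ← α + f(d*_i) − f(d̂_i) is a perceptron-style reranker step. A sketch under the assumption that f(d) is a feature vector per candidate derivation; the feature extraction here is a random stand-in, not the paper's feature set:

```python
# Perceptron-style reranker update: score each beam candidate with the
# current weights, and if the top candidate is not the gold derivation,
# move the weights toward the gold features.

import numpy as np

def rerank_update(alpha, gold_feats, candidate_feats):
    """One update step: alpha <- alpha + f(d*) - f(d_hat) on a mistake."""
    scores = candidate_feats @ alpha         # f(d_ji) . alpha for each candidate
    best = int(np.argmax(scores))            # index of the top-scoring candidate
    if not np.array_equal(candidate_feats[best], gold_feats):
        alpha = alpha + gold_feats - candidate_feats[best]
    return alpha, best

rng = np.random.default_rng(0)
alpha = np.zeros(8)
for _ in range(4):  # one update per word of "Banks refused to open"
    cands = rng.normal(size=(5, 8))          # f(d_1i) ... f(d_5i), stand-ins
    gold = cands[2] + 0.1                    # pretend gold derivation features
    alpha, best = rerank_update(alpha, gold, cands)
print(alpha.shape)  # (8,)
```

Each pass over a sentence thus performs one update per word, matching the four updates shown for the four-word prefix.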


SLIDE 53

Next Steps

Features

Baseline: PLTAG probability model score

Syntactic features: current lexicon entry, previous lexicon entry, bigram lexicon entries, unlexicalised features

Semantic features: current SRL triple(s), semantic score



SLIDE 56

Next Steps

Semantic Score

Blacoe and Lapata (2013): CDT model trained using SRL instead of dependencies
Sayeed and Demberg (ongoing): a Baroni and Lenci (2010)-inspired model, also trained using SRL instead of dependencies
Baselines (no syntax): Mikolov et al. (2013); Mitchell and Lapata (2010)

Multiple triples (vary the composition function):

The temperature will be taken from him
A1,temperature,taken   AM-MOD,will,taken   A2,him,taken
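Composing the three triples above into one semantic representation can be sketched with additive and multiplicative composition in the spirit of Mitchell and Lapata (2010). The word vectors are random stand-ins, and the exact composition the authors intend is left open on the slide:

```python
# Sketch: compose each (role, argument, predicate) triple into a vector,
# then combine the per-triple vectors into one sentence representation.
# Vectors are random stand-ins; the composition functions are the
# additive and multiplicative models of Mitchell and Lapata (2010).

import numpy as np

rng = np.random.default_rng(0)
vec = {w: rng.normal(size=4) for w in
       ["temperature", "will", "him", "taken", "A1", "AM-MOD", "A2"]}

def compose_triple(role, arg, pred, mode="add"):
    vs = [vec[role], vec[arg], vec[pred]]
    return np.sum(vs, axis=0) if mode == "add" else np.prod(vs, axis=0)

triples = [("A1", "temperature", "taken"),
           ("AM-MOD", "will", "taken"),
           ("A2", "him", "taken")]

# combine the per-triple vectors into a single sentence representation
sent_add = np.sum([compose_triple(*t, mode="add") for t in triples], axis=0)
sent_mul = np.prod([compose_triple(*t, mode="mul") for t in triples], axis=0)
print(sent_add.shape, sent_mul.shape)
```

Varying `compose_triple` (and the outer combination) is exactly the "vary composition function" experiment suggested by the slide.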

SLIDE 57

Next Steps

Thank you
