1. Semantic Roles & Labeling
LING 571 — Deep Processing in NLP
November 18, 2019
Shane Steinert-Threlkeld

2. Announcements
● HW7: 89.4 average
● Only common mistake: similarity vs. distance
  sim(u, v) = 1 − distance(u, v)
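Since this tripped people up, a one-line illustration using SciPy (an aside, not part of the assignment spec): SciPy's cosine() returns a *distance*, so the similarity is one minus it.

    import numpy as np
    from scipy.spatial.distance import cosine

    u = np.array([1.0, 2.0, 3.0])
    v = np.array([2.0, 4.0, 6.0])

    distance = cosine(u, v)     # 0.0: the vectors are parallel
    similarity = 1 - distance   # 1.0: maximal similarity
    print(distance, similarity)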

3. Questions on HW #8
● For the mc_similarity portion, use eq. 2 from Resnik (1999):
  wsim(w1, w2) = max over (c1, c2) of [ sim_resnik(c1, c2) ]
  where c1 ranges over the senses of w1 and c2 over the senses of w2
● The numbers in the example_output are random. No meaning to them being < 1!
● For the WSD algorithm (mea culpa): the pseudocode is confusing, so an alternative follows on the next slide.
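For concreteness, a minimal sketch of that equation, assuming NLTK's WordNet interface and the Brown information-content file (an illustration, not the official HW8 starter code):

    # Resnik (1999), eq. 2: word similarity is the max information-content
    # similarity over all pairs of (noun) senses of the two words.
    from nltk.corpus import wordnet as wn
    from nltk.corpus import wordnet_ic

    brown_ic = wordnet_ic.ic('ic-brown.dat')

    def wsim(w1, w2):
        return max(
            c1.res_similarity(c2, brown_ic)  # Resnik similarity of two synsets
            for c1 in wn.synsets(w1, pos=wn.NOUN)
            for c2 in wn.synsets(w2, pos=wn.NOUN)
        )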

4. Alternative Resnik WSD Pseudocode
Given: input word w0 and probe words {p1, …, pn}

    for pi in {p1, …, pn}:
        supported_sense = null
        most_information = 0.0
        for sense_w in SENSES(w0):
            for sense_p in SENSES(pi):
                lcs_synset = LOWEST-COMMON-SUBSUMER(sense_w, sense_p)
                lcs_info = INFORMATION-CONTENT(lcs_synset)
                if lcs_info > most_information:
                    most_information = lcs_info
                    supported_sense = sense_w
        increment support[supported_sense] by most_information
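A runnable version of the pseudocode above, assuming NLTK's WordNet and the Brown information-content counts (a sketch, not the official solution):

    from collections import defaultdict
    from nltk.corpus import wordnet as wn
    from nltk.corpus import wordnet_ic
    from nltk.corpus.reader.wordnet import information_content

    brown_ic = wordnet_ic.ic('ic-brown.dat')

    def resnik_wsd(w0, probes):
        """Return the noun sense of w0 best supported by the probe words."""
        support = defaultdict(float)
        for p in probes:
            supported_sense, most_information = None, 0.0
            for sense_w in wn.synsets(w0, pos=wn.NOUN):
                for sense_p in wn.synsets(p, pos=wn.NOUN):
                    lcs = sense_w.lowest_common_hypernyms(sense_p)
                    if not lcs:
                        continue
                    lcs_info = information_content(lcs[0], brown_ic)
                    if lcs_info > most_information:
                        most_information = lcs_info
                        supported_sense = sense_w
            if supported_sense is not None:
                support[supported_sense] += most_information
        return max(support, key=support.get)

    # e.g. resnik_wsd('bank', ['money', 'deposit', 'loan'])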

5. Semantic Roles

6. Semantic Analysis
● Full, deep compositional semantics
  ● Creates full logical form
  ● Links sentence meaning representation to logical world model representation
  ● Powerful, expressive, AI-complete
● Domain-specific slot-filling:
  ● Common in dialog systems, IE tasks
  ● Narrowly targeted to domain/task, e.g. ORIGIN_LOC, DESTINATION_LOC, AIRLINE, …
  ● Often pattern-matching
  ● Low cost, but lacks generality, richness, etc.
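The "pattern-matching" point can be made concrete with a toy slot filler; a hypothetical flight-domain example (not from the lecture, slot names illustrative):

    import re

    # Toy pattern-matching slot filler for an ATIS-style flight domain.
    pattern = re.compile(r'from (?P<ORIGIN_LOC>\w+) to (?P<DESTINATION_LOC>\w+)')

    m = pattern.search('Book me a flight from Boston to Portland')
    print(m.groupdict())  # {'ORIGIN_LOC': 'Boston', 'DESTINATION_LOC': 'Portland'}

Low cost, as the slide says, but the pattern breaks on any phrasing it was not written for.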

7. Semantic Role Labeling
● Typically want to know:
  ● Who did what to whom
  ● … where, when, and how
● An intermediate level of analysis:
  ● Shallower than full deep composition
  ● Abstracts away (somewhat) from surface form
  ● Captures general predicate-argument structure info
  ● Balances generality and specificity

8. Examples
  Yesterday Tom chased Jerry
  Yesterday Jerry was chased by Tom
  Tom chased Jerry yesterday
  Jerry was chased yesterday by Tom
● Semantic roles:
  ● Chaser: Tom
  ● ChasedThing: Jerry
  ● TimeOfChasing: yesterday
● Same across all sentence forms

9. Full Event Semantics
● Neo-Davidsonian style:
  ∃e Chasing(e) ∧ Chaser(e, Tom) ∧ ChasedThing(e, Jerry) ∧ TimeOfChasing(e, Yesterday)
● Same across all examples
● Roles: Chaser, ChasedThing, TimeOfChasing
  ● Specific to the verb "chase"
  ● a.k.a. "deep roles"
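As an aside, NLTK's logic module can parse exactly this kind of formula; a small illustration (not from the lecture materials):

    # Parse the Neo-Davidsonian logical form with NLTK's logic parser.
    from nltk.sem.logic import Expression

    lf = Expression.fromstring(
        r'exists e.(Chasing(e) & Chaser(e, Tom) & '
        r'ChasedThing(e, Jerry) & TimeOfChasing(e, Yesterday))'
    )
    print(lf)  # the event variable e is bound by the existential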

10. Main Idea
● Extract the semantic roles without doing full semantic parsing
● Easier problem, but still useful for many tasks
  ● More data
  ● Better models

11. Issues & Challenges
● How many roles for a language?
  ● Arbitrary! Each verb's event structure determines its set of roles

12. Issues & Challenges
● How can we acquire these roles?
  ● Manual construction?
  ● Some progress on automatic learning, mostly successful on limited domains (ATIS, GeoQuery)

13. Issues & Challenges
● Can we capture generalities across verbs/events?
  ● Not really: each event/role is specific

14. Thematic Roles
● A solution to the problem of instantiating a specific set of roles for every verb
● Attempt to capture commonalities between roles across verbs

15. Thematic Roles
● Describe common semantic roles of verbal arguments
  ● e.g. the subject of break is an AGENT
  ● AGENT: volitional cause
  ● THEME: thing affected by the action
● Enable generalization over surface order of arguments
  ● [John]AGENT broke [the window]THEME
  ● [The rock]INSTRUMENT broke [the window]THEME
  ● [The window]THEME was broken by [John]AGENT

16. Thematic Roles
● Verbs take different roles; break can be realized as:
  ● AGENT/Subject, THEME/Object (John broke the window)
  ● AGENT/Subject, THEME/Object, INSTRUMENT/PP-with (John broke the window with a rock)
  ● INSTRUMENT/Subject, THEME/Object (The rock broke the window)
  ● THEME/Subject (The window was broken)

17. Thematic Roles
● Thematic grid, θ-grid, case frame: the set of thematic role arguments of a verb
  ● Subject: AGENT; Object: THEME, or
  ● Subject: INSTRUMENT; Object: THEME
● Verb/diathesis alternations: verbs allow different surface realizations of roles
  ● [Doris]AGENT gave [the book]THEME to [Cary]GOAL
  ● [Doris]AGENT gave [Cary]GOAL [the book]THEME
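A thematic grid can be encoded directly as a mapping from grammatical functions to roles; a toy Python illustration (the encoding is invented for this example, not from any standard resource):

    # Hypothetical theta-grids: each dict is one licensed surface
    # realization of the verb's roles.
    BREAK_GRIDS = [
        {'subject': 'AGENT', 'object': 'THEME'},
        {'subject': 'INSTRUMENT', 'object': 'THEME'},
    ]

    # Diathesis alternation for "give": same roles, two realizations.
    GIVE_GRIDS = [
        {'subject': 'AGENT', 'object': 'THEME', 'pp_to': 'GOAL'},
        {'subject': 'AGENT', 'indirect_object': 'GOAL', 'object': 'THEME'},
    ]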

18. Canonical Roles

Thematic Role   Example
AGENT           The waiter spilled the soup.
EXPERIENCER     John has a headache.
FORCE           The wind blows debris from the mall into our yards.
THEME           Only after Benjamin Franklin broke the ice…
RESULT          The French government has built a regulation-size baseball diamond…
CONTENT         Mona asked, "You met Mary Ann at a supermarket?"
INSTRUMENT      He turned to poaching catfish, stunning them with a shocking device…
BENEFICIARY     Whenever Ann Callahan makes hotel reservations for her boss…
SOURCE          I flew in from Boston.
GOAL            I drove to Portland.

19. Thematic Role Issues
● Hard to produce:
  ● A standard set of roles
  ● A standard definition of roles
    ● Most AGENTs are animate, volitional, sentient, causal… but not all, e.g. "[Google] Agent found the answer."
● Fragmentation: roles often need to be made more specific
  ● e.g. some INSTRUMENTs can be subjects and some cannot (examples from Levin and Rappaport Hovav 2005):
    a. John broke the window with a rock. / b. The rock broke the window.
    a. Swabha ate the banana with a fork. / b. *The fork ate the banana.

20. Thematic Role Issues
● Strategies:
  ● Generalized semantic roles: PROTO-AGENT / PROTO-PATIENT, defined heuristically: PropBank
  ● Define roles specific to verbs/nouns: FrameNet

21. PropBank
● Sentences annotated with semantic roles
  ● Penn and Chinese Treebanks
● Roles specific to verb sense
  ● Numbered: Arg0, Arg1, Arg2, …
  ● Arg0: PROTO-AGENT; Arg1: PROTO-PATIENT, etc.

22. PropBank
● Arguments above Arg1 are verb-specific, e.g. agree.01:
  ● Arg0: Agreer
  ● Arg1: Proposition
  ● Arg2: Other entity agreeing
● Ex1: [Arg0 The group] agreed [Arg1 it wouldn't make an offer]
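NLTK ships a PropBank reader, so the agree.01 frameset can be inspected programmatically; a sketch assuming the propbank corpus is downloaded:

    # Look up the roleset for agree.01 and print its numbered arguments.
    from nltk.corpus import propbank

    roleset = propbank.roleset('agree.01')  # an ElementTree element
    for role in roleset.findall('roles/role'):
        print(role.attrib['n'], role.attrib['descr'])
    # Expected (roughly): 0 agreer / 1 proposition / 2 other entity agreeing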

23. PropBank
● Resources:
  ● Annotated sentences
    ● Started with the Penn Treebank
    ● Now: Google answerbank, SMS, webtext, etc.
  ● Framesets:
    ● Per-sense inventories of roles, with examples
    ● Span verbs, adjectives, nouns (e.g. event nouns)

24. PropBank
● propbank.github.io
● Recent status:
  ● 5940 verbs with 8121 framesets
  ● 1880 adjectives with 2210 framesets
● Continued into OntoNotes
● [CoNLL 2005 and 2012 shared tasks]

25. AMR
● "Abstract Meaning Representation"
● Sentence-level semantic representation
● Nodes: concepts
  ● English words, PropBank predicates, or keywords ('person')
● Edges: relations
  ● PropBank thematic roles (ARG0–ARG5)
  ● Others including 'location', 'name', 'time', etc.
  ● ~100 in total

26. AMR 2
● AMR Bank: (now) ~40K annotated sentences
● JAMR parser: 63% F-measure (2015)
● Alignments between word spans & graph fragments
● Example (from Liu et al. 2015): "I saw Joe's dog, which was running in the garden."
  [slide shows the AMR graph for this sentence; reconstructed below]
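The example's graph, reconstructed in PENMAN notation (the standard AMR serialization; variable names are conventional, chosen here for readability):

    (s / see-01
       :ARG0 (i / i)
       :ARG1 (d / dog
                :poss (p / person
                         :name (n / name :op1 "Joe"))
                :ARG0-of (r / run-02
                            :location (g / garden))))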

27. AMR 3
● Towards full semantic parsing
● "Deeper" than base PropBank, but:
  ● No real quantification
  ● No articles
  ● No real vs. hypothetical events (e.g. "wants to go")

28. FrameNet (Fillmore et al.)
● Key insight: commonalities hold not just across different sentences with the same verb, but across different verbs (and nouns and adjectives)
● PropBank:
  ● [Arg0 Big Fruit Co.] increased [Arg1 the price of bananas].
  ● [Arg1 The price of bananas] was increased by [Arg0 BFCo].
  ● [Arg1 The price of bananas] increased [Arg2 5%].
● FrameNet:
  ● [ATTRIBUTE The price] of [ITEM bananas] increased [DIFF 5%].
  ● [ATTRIBUTE The price] of [ITEM bananas] rose [DIFF 5%].
  ● There has been a [DIFF 5%] rise in [ATTRIBUTE the price] of [ITEM bananas].

29. FrameNet
● Semantic roles specific to a frame
  ● Frame: script-like structure with roles (frame elements)
  ● e.g. CHANGE_POSITION_ON_SCALE: increase, rise
    ● ATTRIBUTE, INITIAL_VALUE, FINAL_VALUE
  ● Core vs. non-core roles
● Relationships between frames and frame elements
  ● Add causative: CAUSE_CHANGE_POSITION_ON_SCALE
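NLTK also wraps FrameNet itself; a small sketch for browsing this frame (in the FrameNet data the frame is named Change_position_on_a_scale):

    # Inspect the frame's elements and lexical units via NLTK's reader.
    from nltk.corpus import framenet as fn

    frame = fn.frame('Change_position_on_a_scale')
    print(sorted(frame.FE))       # frame elements, e.g. Attribute, Initial_value, Final_value, ...
    print(sorted(frame.lexUnit))  # lexical units, e.g. 'climb.v', 'rise.v', 'increase.v', ...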

30. Change of position on scale
VERBS: advance, climb, decline, decrease, diminish, dip, double, drop, dwindle, edge, explode, fall, fluctuate, gain, grow, increase, jump, move, mushroom, plummet, reach, rise, rocket, shift, skyrocket, slide, soar, swell, swing, triple, tumble
NOUNS: decline, decrease, escalation, explosion, fall, fluctuation, gain, growth, hike, increase, rise, shift, tumble
ADVERBS: increasingly
