  1. Domain Adaptation for Constituency Parsing Using Partial Annotations Vidur Joshi Matthew Peters Mark Hopkins

  2. Constituency Parsing is Useful Textual Entailment (Bowman et al., 2016) Semantic Parsing (Hopkins et al., 2017) Sentiment Analysis (Socher et al., 2013) Language Modeling (Dyer et al., 2016)

  3. Penn Treebank (PTB) (Marcus et al., 1993) 40,000 annotated sentences Newswire domain

  4. But, Target Domains Are Diverse! Geometry Problem: In the rhombus PQRS, PR = 24 and QS = 10. Question: What's the second-most-used vowel in English? Biochemistry: Ethoxycoumarin was metabolized by isolated epidermal cells via dealkylation to 7-hydroxycoumarin ( 7-OHC ) and subsequent conjugation.

  5. Performance Outside Source Domain Parse the geometry sentence with a PTB-trained parser


  8. How can we cheaply create high-quality parsers for new domains?

  9. Relevant Recent Developments in NLP Contextualized word representations improve sample efficiency (Peters et al., 2018). Span-focused models achieve state-of-the-art constituency parsing results (Stern et al., 2017).

  10. Contributions Show contextual word embeddings help domain adaptation, e.g., over 90% F1 on the Brown Corpus. Adapt a parser using partial annotations, e.g., increase correct geometry-domain parses by 23%.

  11. Outline Review of Contextual Word Representations. Partial Annotations: Definition, Training, Parsing as Span Classification, The Span Classification Model. Experiments and Results: Performance on PTB and New Domains, Adapting Using Partial Annotations.

  13. Contextualized Word Representations ELMo, trained on the Billion Word Corpus (Peters et al., 2018), improves sample efficiency.

  14. Partial Annotations: Definition, Training, Parsing as Span Classification, The Span Classification Model

  15. Selectively Annotate Important Phenomena A triangle has a perimeter of 16 and one side of length 4.

  16. Selectively Annotate Important Phenomena A triangle has [ a perimeter of 16 ] and one side of length 4.

  18. Selectively Annotate Important Phenomena A triangle has [ a perimeter { of 16 ] and one side of length 4 } .

  19. Full Versus Partial Annotation (S (NP A triangle ) (VP has (NP (NP (NP a perimeter ) (PP of 16 )) and (NP (NP one side ) (PP of (NP length 4 ))))) . ) A triangle has [ a perimeter { of 16 ] and one side of length 4 } .

  20. Partial Annotation Definition A partial annotation is a labeled span: A triangle has [ a perimeter of 16 ] and one side of length 4 . A triangle has [ NP a perimeter of 16 ] and one side of length 4 . A triangle has a perimeter { of 16 and one side of length 4 } .

  21. Why Partial Annotations? Allowing annotators to selectively annotate important phenomena makes the process faster and simpler (Mielens et al., 2015).
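A partial annotation like the one on slide 20 can be stored as a single labeled span over the tokenized sentence. A minimal sketch, assuming fencepost indexing; the helper name is mine, not from the talk:

```python
# A partial annotation is one labeled span over a tokenized sentence.
# Spans use fencepost indices: (start, end) covers tokens[start:end].

def span_tokens(tokens, start, end):
    """Return the tokens covered by the span [start, end)."""
    return tokens[start:end]

tokens = "A triangle has a perimeter of 16 and one side of length 4 .".split()

# "[ NP a perimeter of 16 ]" from the slide: tokens 3..7, labeled NP.
annotation = (3, 7, "NP")

start, end, label = annotation
print(label, span_tokens(tokens, start, end))
```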

  22. Partial Annotations: Definition, Training, Parsing as Span Classification, The Span Classification Model

  23. Objective for Full Annotation

  24. Objective for Partial Annotation Since we do not have a full parse, marginalize out components for which no supervision exists. Expensive!
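The objective equations on these slides do not survive in this transcript. A plausible reconstruction, with T the gold tree, A the set of annotated spans, and θ the model parameters (all notation is mine):

```latex
% Full annotation: maximize the likelihood of the gold tree T for sentence S.
\mathcal{L}_{\text{full}}(\theta) = \log P_\theta(T \mid S)

% Partial annotation: only a set A of labeled spans is observed, so
% marginalize over all trees consistent with A -- expensive in general.
\mathcal{L}_{\text{partial}}(\theta) = \log \sum_{T' \,:\, T' \supseteq A} P_\theta(T' \mid S)
```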

  26. One Solution: Approximation (Mirroshandel and Nasr, 2011; Majidi and Crane, 2013; Nivre et al., 2014; Li et al., 2016)

  30. Our Solution: Parsing as Span Classification Assume the probability of a parse factors into a product of span probabilities. The objective now simplifies, and is easy if the model classifies spans!
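The simplification can be reconstructed as follows, continuing the notation above (T the tree, A the annotated spans; the symbols are my own, since the slide's equation is lost):

```latex
% Assume the parse probability factors over spans; every span (i, j) gets a
% label \ell_T(i, j), possibly the empty label:
P_\theta(T \mid S) = \prod_{(i,j)} P_\theta\big(\ell_T(i,j) \mid i, j, S\big)

% Under this independence assumption, marginalizing over the unannotated
% spans sums to one, and the partial-annotation objective reduces to
% independent span classifications:
\mathcal{L}_{\text{partial}}(\theta) = \sum_{(i,j,\ell) \in A} \log P_\theta(\ell \mid i, j, S)
```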

  31. Partial Annotations: Definition, Training, Parsing as Span Classification, The Span Classification Model

  32. Parse Tree Labels All Spans (Cross and Huang, 2016; Stern et al., 2017)

  43. Training on Full and Partial Annotations ▪ A partial annotation is a labeled span. ▪ A full parse labels every span in the sentence. ▪ Therefore, training on both is identical under our derived objective.
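To make "a full parse labels every span" concrete, here is a small sketch; the toy sentence, tree, and labels are mine. Spans that are constituents get their label; every other span gets the empty label:

```python
# Toy gold tree for "She enjoys tennis": (S (NP She) (VP enjoys (NP tennis))),
# represented as a map from fencepost spans to labels.
constituents = {(0, 1): "NP", (1, 3): "VP", (2, 3): "NP", (0, 3): "S"}

def label_of(i, j):
    """Every span has a label; non-constituents get the empty label ""."""
    return constituents.get((i, j), "")

n = 3
all_spans = {(i, j): label_of(i, j) for i in range(n) for j in range(i + 1, n + 1)}
print(all_spans)
```

A full parse thus supervises all n(n+1)/2 spans, while a partial annotation supervises only the spans the annotator marked; the per-span objective treats both identically.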

  44. Parsing Using Span Classification Model Find maximum using dynamic programming:
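The dynamic program itself is not spelled out in the transcript. A minimal sketch in the spirit of a span-based chart decoder (Stern et al., 2017): each span contributes its best label score, and internal spans additionally pick their best binary split. The toy sentence, scores, and function names are mine:

```python
def best_parse(tokens, span_score):
    """Chart decoding over fencepost spans.

    span_score(i, j) -> (best_label, score) for span [i, j); the empty
    label "" marks a span that is not a constituent.
    Returns (bracketing string, total score of the best tree).
    """
    n = len(tokens)
    best, back = {}, {}
    for length in range(1, n + 1):
        for i in range(n - length + 1):
            j = i + length
            label, score = span_score(i, j)
            if length == 1:
                best[i, j], back[i, j] = score, (label, None)
            else:
                # Choose the best binary split point of [i, j).
                k = max(range(i + 1, j), key=lambda k: best[i, k] + best[k, j])
                best[i, j] = score + best[i, k] + best[k, j]
                back[i, j] = (label, k)

    def build(i, j):
        label, k = back[i, j]
        children = [tokens[i]] if k is None else build(i, k) + build(k, j)
        # Spans with the empty label contribute no bracket.
        return [f"({label} {' '.join(children)})"] if label else children

    return " ".join(build(0, n)), best[0, n]

# Toy scorer for "She enjoys tennis" (scores invented for illustration).
scores = {(0, 1): ("NP", 1.0), (1, 2): ("", 0.0), (2, 3): ("NP", 1.0),
          (0, 2): ("", 0.1), (1, 3): ("VP", 2.0), (0, 3): ("S", 3.0)}
tree, total = best_parse("She enjoys tennis".split(), lambda i, j: scores[i, j])
print(tree)   # (S (NP She) (VP enjoys (NP tennis)))
```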

  47. Summary Partial annotations are labeled spans. Use a span classification model to parse. Training on partial and full annotations becomes identical.

  48. Partial Annotations: Definition, Training, Parsing as Span Classification, The Span Classification Model

  49. Model Architecture (Stern et al., 2017) [figure: the input sentence "She enjoys playing tennis ." is embedded and fed through a bidirectional LSTM, producing a hidden state at each position]

  53. Span Embedding (Wang and Chang, 2016; Cross and Huang, 2016; Stern et al., 2017) [figure: the embedding of the span "enjoys playing" is computed as a difference of LSTM states at the span endpoints]

  54. Model Architecture (Stern et al., 2017) [figure: an MLP maps each span embedding to a distribution over labels]
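The difference-of-endpoints span embedding pictured on slide 53 can be sketched numerically; the array shapes and names here are my own assumptions:

```python
import numpy as np

def span_embedding(fwd, bwd, i, j):
    """Embed span [i, j) as differences of biLSTM states at its endpoints.

    fwd, bwd: (n + 1, d) arrays of forward/backward LSTM states at the
    n + 1 fenceposts of an n-token sentence.
    """
    return np.concatenate([fwd[j] - fwd[i], bwd[i] - bwd[j]])

n, d = 5, 4                      # "She enjoys playing tennis ." has 5 tokens
rng = np.random.default_rng(0)
fwd = rng.normal(size=(n + 1, d))
bwd = rng.normal(size=(n + 1, d))

emb = span_embedding(fwd, bwd, 1, 3)   # span "enjoys playing"
print(emb.shape)                       # (8,)
```

The difference of endpoint states summarizes what the LSTM read between the two fenceposts, so a single subtraction yields a fixed-size representation for any span.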

  55. Differences
      Objective: Ours = maximum likelihood on labels; Stern et al., 2017 = maximum margin on trees
      ELMo: Ours = yes; Stern et al., 2017 = no
      POS tags as input: Ours = no; Stern et al., 2017 = yes

  59. Experiments and Results: Performance on PTB, Learning Curve on New Domains, Adapting Using Partial Annotations

  60. Performance on PTB Stern et al. (2017): 91.8 F1. Ours = maximum likelihood on labels, no POS tags as input: +0.3 F1. Adding ELMo: +2.2 F1. Total: 94.3 F1.

  61. Performance on PTB Ours: 94.3 F1, a +1.7 F1 gain over the previous state of the art of 92.6 F1 ("Effective Inference for Generative Neural Parsing"). A newer SoTA is 95.1 (Kitaev and Klein, ACL 2018).

  62. Performance on PTB, Learning Curve on New Domains, Adapting Using Partial Annotations

  63. Question Bank (Judge et al., 2006) ▪ 4,000 questions. ▪ In contrast, PTB has few questions. Who is the author of the book, ``The Iron Lady: A Biography of Margaret Thatcher''?

  64. Do We Need Domain Adaptation? [learning curve: F1 on Question Bank vs. number of Question Bank parses in training; a PTB-only parser reaches 89.9 F1, and training on Question Bank adds +7.2% F1]

  65. How Much Data Do We Need? [same curve: going from 0 to 100 Question Bank parses adds +6.3% F1; going from 100 to 2,000 parses adds only +0.9% F1]

  66. How Much Data Do We Need? Not much: improvements taper quickly.

  67. Experiments and Results: Performance on PTB, Learning Curve on New Domains, Adapting Using Partial Annotations

  68. Geometry Problems (Seo et al., 2015) In the diagram at the right, circle O has a radius of 5, and CE = 2. Diameter AC is perpendicular to chord BD at E. What is the length of BD? Biochemistry (Nivre et al., 2007) Ethoxycoumarin was metabolized by isolated epidermal cells via dealkylation to 7-hydroxycoumarin ( 7-OHC ) and subsequent conjugation .

  69. Setup The annotator is a parsing expert who sees the parser's output. Annotated sentences are randomly split into train and dev.
