Towards Bootstrapping a Polarity Shifter Lexicon using Linguistic Features



  1. International Joint Conference on Natural Language Processing, November 29, 2017
     Towards Bootstrapping a Polarity Shifter Lexicon using Linguistic Features
     Marc Schulder and Michael Wiegand (Spoken Language Systems, Saarland University)
     Josef Ruppenhofer (Institute for German Language, Mannheim)
     Benjamin Roth (Center for Information and Language Processing, LMU Munich)

  2. What are Polarity Shifters?
     Shifters, like negation words, move the polarity of a phrase towards the opposite of the polar term they contain.
     Negation:
       Peter [did not [pass]+ the exam]-.
       They [did not [destroy]- the temple]+.
     Verbal shifters:
       Peter [failed to [pass]+ the exam]-.
       They [failed to [destroy]- the temple]+.

  3. Overview
     • Motivation
     • Bootstrapping a Lexicon
     • Features
     • Classification
     • Output Verification
     • Extrinsic Evaluation
     • Conclusion

  4–8. Negation vs. Verbal Shifters
                               Negation         Verbal Shifters
       Word Type               Function words   Content words
       Vocabulary Size         Large            Small (15% of verbs)
       Individual Frequency    High             Low
       Full Coverage           Yes              No

  9–17. Pipeline
     Manually annotating a large shifter lexicon from scratch would be expensive. Instead, the lexicon is bootstrapped: a small, labelled Base Shifter Lexicon and the remaining unlabelled WordNet verbs are passed, via linguistic features, to a classifier; the predicted shifters are then manually verified before entering the Large Shifter Lexicon.
     Base Shifter Lexicon (labelled) + WordNet verbs (unlabelled) → Features → Classifier → Verify Shifters → Large Shifter Lexicon

  18. Generic Features
     WordNet
     • Glosses: word definitions (bag of words).
     • Hypernyms: words with a more general meaning.
     • Supersenses: coarse semantic categories.
     FrameNet
     • Verb Frames: semantic verb groups.
       Frame AVOIDING: desist, dodge, evade, shun, shirk, ...
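To make the WordNet features concrete, here is a minimal extraction sketch using NLTK's WordNet interface. The paper does not prescribe NLTK or this exact encoding, so treat the function and its output format as illustrative assumptions.

```python
from nltk.corpus import wordnet as wn  # requires nltk.download('wordnet')

def wordnet_features(verb):
    """Collect gloss words, hypernym lemmas, and supersenses for a verb."""
    feats = {"gloss": set(), "hypernym": set(), "supersense": set()}
    for synset in wn.synsets(verb, pos=wn.VERB):
        # Gloss: bag of words over the synset's definition text.
        feats["gloss"].update(synset.definition().lower().split())
        # Hypernyms: lemmas of the more general synsets.
        for hyper in synset.hypernyms():
            feats["hypernym"].update(hyper.lemma_names())
        # Supersense: the coarse lexicographer file, e.g. 'verb.social'.
        feats["supersense"].add(synset.lexname())
    return feats

print(wordnet_features("fail"))
```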

  19. Task-specific Features
     1. Distributional Similarity
        Choose verbs similar to negation words like not, no, etc.
     2. Polarity Clash
        Negative verb with positive object.
        She [lost [hope]+]-.
     3. Particle Verbs
        Some particles indicate "loss" (e.g. aside, down, off, ...).
        Please [lay aside all your [worries]-]+.
     4. any-Heuristic (the best feature)
        The word any co-occurs with negation/shifters.
        They did [not give us any [help]+]-.
        They [denied us any [help]+]-.

  20. Anti-Shifter Feature
     Anti-shifter: co-occurrence with adverbs that are
     • attracted to verbs of creation;
     • repelled by verbs of destruction.
     Examples (the marked verb co-occurs with an anti-shifter adverb):
       Black bears exclusively live (anti-shifter) on fish.
       Keyboards on phones were first introduced (anti-shifter) in 1997.
       These buildings have been newly constructed (anti-shifter).
       They specially prepared (anti-shifter) vegan dishes for me.
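Both the any-heuristic and the anti-shifter feature reduce to co-occurrence statistics over a corpus. Below is a minimal sketch of that idea, assuming pre-lemmatized (verb, sentence tokens) pairs and a hand-picked adverb seed list; neither reflects the paper's exact implementation.

```python
from collections import Counter

# Hypothetical seed adverbs attracted to verbs of creation (see examples above).
ANTI_ADVERBS = {"exclusively", "first", "newly", "specially"}

def cooccurrence_scores(samples):
    """Fraction of a verb's sentences containing 'any' / an anti-shifter adverb.

    samples: iterable of (verb_lemma, token_list) pairs, assumed to come
    from an upstream parser/lemmatizer.
    """
    total, with_any, with_anti = Counter(), Counter(), Counter()
    for verb, tokens in samples:
        words = {t.lower() for t in tokens}
        total[verb] += 1
        with_any[verb] += "any" in words
        with_anti[verb] += bool(words & ANTI_ADVERBS)
    any_score = {v: with_any[v] / n for v, n in total.items()}
    anti_score = {v: with_anti[v] / n for v, n in total.items()}
    return any_score, anti_score

# Toy usage:
samples = [("deny", "They denied us any help".split()),
           ("construct", "These buildings have been newly constructed".split())]
print(cooccurrence_scores(samples))
```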

  21. Pipeline: Data Sizes
     • Base Shifter Lexicon (labelled): 2000 verbs, 304 of them shifters.
     • WordNet verbs (unlabelled): 8581 verbs.

  22–24. Classifier Setup
     SVM
     • Training: Base Lexicon (2000 verbs, incl. 304 shifters)
     • Labels: shifter, non-shifter
     • Evaluation: 10-fold cross-validation
     Baselines
     • Majority Label: all verbs are non-shifters.
     • Graph Clustering (approach with no labelled training data)
       • Input: word embedding graph + seeds
       • Positive seeds: ANY (the best shifter feature)
       • Negative seeds: ANTI
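A rough sketch of this setup in scikit-learn follows; the slides only specify an SVM with 10-fold cross-validation, so the SVM variant and the indicator-style feature encoding are assumptions.

```python
from sklearn.feature_extraction import DictVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import LinearSVC

def evaluate_shifter_svm(feature_dicts, labels):
    """10-fold cross-validation of a shifter/non-shifter classifier.

    feature_dicts: one dict of feature indicators per verb, e.g.
                   {"gloss=fail": 1, "supersense=verb.social": 1, "any": 0.4}
    labels:        1 for shifter, 0 for non-shifter.
    """
    model = make_pipeline(DictVectorizer(), LinearSVC())
    scores = cross_val_score(model, feature_dicts, labels, cv=10,
                             scoring="f1_macro")
    return scores.mean()
```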

  25. Classifier Performance (Macro F1)
     Majority: 0.46 | Graph Clustering: 0.62 | SVM: 0.79

  26. Pipeline: Data Sizes
     The classifier labels the 8581 unlabelled WordNet verbs, predicting 1043 of them to be shifters.

  27. Shifter Verification
     • Task: a human annotator verifies the predicted shifters.
     • Input: 1043 verbs predicted as shifters.
     • Output: 676 verbs confirmed as shifters.
     Precision by classifier confidence ranking:
       Ranks 1–250: 0.93 | 251–500: 0.73 | 501–750: 0.62 | 751–1043: 0.33
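For illustration, bin-wise precision over a confidence ranking could be computed as below; the bin boundaries follow the slide, while the data structure is an assumption.

```python
def precision_by_rank(predictions,
                      bins=((1, 250), (251, 500), (501, 750), (751, 1043))):
    """Precision within bins of confidence-ranked predictions.

    predictions: list of (confidence, confirmed) pairs, where `confirmed`
    is True if the human annotator verified the verb as a shifter.
    """
    ranked = sorted(predictions, key=lambda p: p[0], reverse=True)
    result = {}
    for lo, hi in bins:
        window = ranked[lo - 1:hi]  # ranks are 1-based
        result[f"{lo}-{hi}"] = sum(ok for _, ok in window) / len(window)
    return result
```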

  28–30. Pipeline: Data Sizes
     Verification confirms 676 of the predicted shifters. Together with the 304 shifters of the base lexicon, the Large Shifter Lexicon contains 304 + 676 = 980 shifters, which can then be used for fine-grained sentiment analysis.

  31. Extrinsic Evaluation: Sentiment Analysis
     Task: given a verb phrase containing a polar noun, decide whether the phrase polarity has shifted away from the polarity of the noun.
     Input: Norah Jones' smooth voice could [soothe_V any savage [beast_N]-]?_VP.
     Output labels: shifted, not shifted.
     Gold data: Amazon Product Review Corpus (Jindal and Liu, 2008); 2631 phrases, balanced for the ratio of shifters among verbs.

  32–34. Extrinsic Evaluation: Classifiers
     Proposed: classifier using the bootstrapped lexicon (LEX)
     • If the verb is in the shifter lexicon ⇒ shifted.
     Baselines
     • Majority Label: all phrases are not shifted.
     • Recursive Neural Tensor Network (RNTN; Socher et al., 2013)
       • Compositional sentence-level polarity classifier.
       • Provides polarities for each constituency tree node.
       • No explicit knowledge of shifters.
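The LEX classifier is essentially a set lookup. A minimal sketch, assuming the lexicon is loaded as a set of verb lemmas; the toy subset below is for illustration only.

```python
def is_shifted(verb_lemma, shifter_lexicon):
    """Predict 'shifted' iff the phrase's verb is a known polarity shifter."""
    return verb_lemma in shifter_lexicon

shifters = {"fail", "deny", "lose", "soothe"}  # toy subset, not the real lexicon
print(is_shifted("soothe", shifters))  # True: the VP's polarity is shifted
```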

  35. Extrinsic Evaluation: Results (Macro F1)
     Majority: 0.44 | RNTN: 0.51 | LEX: 0.80

  36–37. Conclusion
     • Produced a large lexicon of 980 shifters.
       Available at https://github.com/marcschulder/ijcnlp2017
     • Bootstrapping reduces the cost of high-quality annotation.
