Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis - PowerPoint PPT Presentation


SLIDE 1

Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis

Presented by Kay Lu, Ashley Gao, Qusheng Sun

SLIDE 2

Introduction to Sentiment Analysis

Definition

The task of identifying positive and negative opinions, emotions, and evaluations.

Prior Polarity

The polarity of a word out of context: positive or negative.

Contextual Polarity

The polarity of the phrase in which a word appears, which may differ from the word’s prior polarity.
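To make the distinction concrete, here is a minimal sketch of a prior-polarity lookup against a toy lexicon (the entries and function name are illustrative, not the paper’s actual lexicon); contextual polarity, by contrast, requires the phrase-level features described on the later slides:

```python
# Toy prior-polarity lexicon (illustrative entries, not the paper's lexicon)
PRIOR_POLARITY = {"reason": "positive", "reasonable": "positive",
                  "well": "positive", "polluters": "negative"}

def prior_polarity(word):
    """Out-of-context polarity of a word; "neutral" if unlisted.
    Contextual polarity instead depends on the surrounding phrase."""
    return PRIOR_POLARITY.get(word.lower(), "neutral")
```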

SLIDE 3

Example

Prior Polarity vs. Contextual Polarity

Philip Clap, President of the National Environment Trust, sums up well the general thrust of the reaction of environmental movements: there is no reason at all to believe that the polluters are suddenly going to become reasonable.

(The slide shows the sentence twice: once with prior-polarity clues highlighted, once with contextual polarities.) Words such as well, reason, and reasonable have positive prior polarity, yet in context the phrase “no reason at all to believe” is negative.

SLIDE 4

Manual Annotation

MPQA

  • a corpus of news articles and other text documents, manually annotated for opinion, sentiment, etc.

Human annotation

  • based on the MPQA corpus, annotators were instructed to tag the polarity of subjective expressions as positive, negative, both, or neutral

SLIDE 5

Agreement Study & Corpus

Agreement study

  • measures the reliability of the polarity annotation by comparing two annotators’ sets of annotations

Corpus

  • the annotated documents are divided to obtain the experiment data

SLIDE 6

Prior-Polarity Subjectivity Lexicon

All words in lexicon tagged with:

  • Prior polarity: positive, negative, both, neutral
  • Reliability: strongly subjective and weakly subjective.
SLIDE 7

Experiment - 2 Steps

  • Use AdaBoost for both steps
  • Use 10-fold cross-validation when testing
  • Step 1: find polar phrases
  • Step 2: disambiguate their contextual polarity

SLIDE 8

Step 1 - Classifier of Clue Instances

Given a clue instance inst from the lexicon, its gold class is defined as:

if inst is not in a subjective expression:
    goldclass(inst) = neutral
else if inst is in at least one positive and one negative subjective expression:
    goldclass(inst) = both
else if inst is in a mixture of negative and neutral expressions:
    goldclass(inst) = negative
else if inst is in a mixture of positive and neutral expressions:
    goldclass(inst) = positive
else:
    goldclass(inst) = the contextual polarity of the subjective expression
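The gold-class rules can be sketched as a small Python function; the input representation (a list of the contextual polarities of the subjective expressions containing the instance, empty when the instance is outside any subjective expression) is an assumption made for illustration:

```python
def gold_class(expression_polarities):
    """Assign the gold class for a clue instance, given the contextual
    polarities of the subjective expressions it appears in, e.g.
    ["positive", "neutral"]; an empty list means the instance is not
    inside any subjective expression."""
    pols = set(expression_polarities)
    if not pols:
        return "neutral"
    if "positive" in pols and "negative" in pols:
        return "both"
    if pols == {"negative", "neutral"}:
        return "negative"
    if pols == {"positive", "neutral"}:
        return "positive"
    # uniform polarity: inherit the expression's contextual polarity
    return expression_polarities[0]
```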

SLIDE 9

Step 1 - Features

1. Word features (e.g., outrageous): token, part of speech, context, prior polarity, reliability. Example: “Outrageous crimes against humanity.”

2. Modification features (binary fields):

  • Preceded by an adjective, an adverb (other than not), or an intensifier?
  • Is the word itself an intensifier?
  • Does it modify a strongsubj and/or weaksubj clue?
  • Is it modified by a strongsubj and/or weaksubj clue?
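A simplified sketch of how these binary fields might be computed; the word lists and the flat modifier-list inputs are stand-ins for the real subjectivity lexicon and dependency parse:

```python
# Hypothetical reliability lexicon: clue word -> "strongsubj" or "weaksubj"
RELIABILITY = {"outrageous": "strongsubj", "good": "weaksubj"}
INTENSIFIERS = {"very", "quite", "really"}

def modification_features(token, preceding_word, modifies, modified_by):
    """Binary modification features for one clue instance.
    `modifies`/`modified_by` list the words linked to the token in a
    dependency parse (illustrative inputs, not a real parser)."""
    return {
        "preceded_by_intensifier": preceding_word in INTENSIFIERS,
        "self_intensifier": token in INTENSIFIERS,
        "modifies_strongsubj": any(RELIABILITY.get(w) == "strongsubj" for w in modifies),
        "modifies_weaksubj": any(RELIABILITY.get(w) == "weaksubj" for w in modifies),
        "modified_by_strongsubj": any(RELIABILITY.get(w) == "strongsubj" for w in modified_by),
        "modified_by_weaksubj": any(RELIABILITY.get(w) == "weaksubj" for w in modified_by),
    }
```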
SLIDE 10

Step 1 - Features (Dependency Tree)

SLIDE 11

Step 1 - Features (Cont’d)

3. Structure features: in subject (Human Rights), in copular (She’s right), in passive voice (must be right)

4. Sentence features: counts of strongsubj clues in the previous, current, and next sentence; counts of weaksubj clues in the previous, current, and next sentence; counts of various parts of speech

5. Document feature
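The sentence-feature counts can be sketched as follows, with toy clue sets standing in for the strongsubj/weaksubj entries of the lexicon:

```python
# Toy clue sets (illustrative stand-ins for the subjectivity lexicon)
STRONGSUBJ = {"outrageous", "evil"}
WEAKSUBJ = {"good", "right"}

def sentence_features(sentences, i):
    """Counts of strong/weak subjective clues in the previous, current,
    and next sentence relative to sentence i."""
    feats = {}
    for label, j in (("prev", i - 1), ("cur", i), ("next", i + 1)):
        words = sentences[j].lower().split() if 0 <= j < len(sentences) else []
        feats[f"strongsubj_{label}"] = sum(w in STRONGSUBJ for w in words)
        feats[f"weaksubj_{label}"] = sum(w in WEAKSUBJ for w in words)
    return feats
```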

SLIDE 12

Step 2: Polarity Classification

Note that Step 1 is automatic, so some noise remains: some neutral instances do in fact get passed on to Step 2. For this step, the polarity classification task remains four-way: (1) positive, (2) negative, (3) both, (4) neutral.

SLIDE 13

Step 2: Polarity Classification

SLIDE 14

Step 2: Polarity Classification

Binary features:

Negated:

1. not good
2. does not look very good
3. not only good but amazing (here not only does not reverse the polarity)

Negated subject:

No politically prudent Israeli could support either of them
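A rough sketch of a negation check of this kind; the window size, the word lists, and the handling of the not only exception are simplified assumptions, not the paper’s exact procedure:

```python
NEGATIONS = {"not", "no", "never"}
FALSE_NEGATIONS = {("not", "only")}  # "not only good" does not flip polarity

def is_negated(tokens, i, window=4):
    """Check whether tokens[i] is preceded, within a small window, by a
    negation term that is not part of a false negation like "not only"."""
    for j in range(max(0, i - window), i):
        if tokens[j] in NEGATIONS:
            if j + 1 < len(tokens) and (tokens[j], tokens[j + 1]) in FALSE_NEGATIONS:
                continue
            return True
    return False
```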

SLIDE 15

Step 2: Polarity Classification

  • Modifies polarity (5 values: positive, negative, neutral, both, notmod)

Example: substantial: negative

  • Modified by polarity (5 values: positive, negative, neutral, both, notmod)

Example: challenge: positive

In substantial (pos) challenge (neg), substantial modifies the negative word challenge, and challenge is modified by the positive word substantial.

  • Conjunction polarity (5 values: positive, negative, neutral, both, notmod)

Example: good: negative, as in good (pos) and evil (neg)
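A minimal sketch of the conjunction-polarity feature, with a toy prior-polarity lexicon (entries and the input representation are illustrative):

```python
# Toy prior-polarity lexicon (illustrative entries only)
PRIOR = {"good": "positive", "evil": "negative",
         "substantial": "positive", "challenge": "negative"}

def conjunction_polarity(word, conjoined_with):
    """Feature value for `word`: the prior polarity of the word on the
    other side of the conjunction (e.g. "good and evil"), or "notmod"
    when the word is not in a conjunction."""
    if conjoined_with is None:
        return "notmod"
    return PRIOR.get(conjoined_with, "neutral")
```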

SLIDE 16

Step 2: Polarity Classification

  • General polarity shifter

pose little threat; contains little truth

  • Negative polarity shifter

lack of understanding

  • Positive polarity shifter

abate the damage
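One way to sketch the effect of these shifters on a clue’s prior polarity; the word lists and the simple flipping rule are illustrative assumptions, not the paper’s exact procedure:

```python
GENERAL = {"little"}    # pose little threat, contains little truth
NEGATIVE = {"lack"}     # lack of understanding
POSITIVE = {"abate"}    # abate the damage

def shifted_polarity(prior, preceding):
    """Apply polarity-shifter cues found in the words preceding a clue:
    positive/negative shifters set the polarity, a general shifter
    reverses it, otherwise the prior polarity stands."""
    if any(w in POSITIVE for w in preceding):
        return "positive"
    if any(w in NEGATIVE for w in preceding):
        return "negative"
    if any(w in GENERAL for w in preceding):
        return {"positive": "negative", "negative": "positive"}.get(prior, prior)
    return prior
```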

SLIDE 17

Results of Step 2: Polarity Classification

  • The classifier using all ten features significantly outperforms the two baseline classifiers

  • A combination of features is needed to achieve significant improvement over the baselines

SLIDE 18

Conclusions

Presented a two-step approach to phrase-level sentiment analysis

(1) Determine whether an expression is neutral or polar
(2) Determine the contextual polarity of the expressions that are polar

The approach automatically identifies the contextual polarity of a large subset of sentiment expressions.

SLIDE 19

Thank you!

Q&A

Works Cited:

[1] Theresa Wilson, Janyce Wiebe, and Paul Hoffmann. 2005. Recognizing contextual polarity in phrase-level sentiment analysis. In Proceedings of the Conference on Human Language Technology and Empirical Methods in Natural Language Processing (HLT ’05), Association for Computational Linguistics, Stroudsburg, PA, USA, 347-354. DOI: https://doi.org/10.3115/1220575.1220619

[2] Theresa Wilson, Janyce Wiebe, and Paul Hoffmann. Recognizing Contextual Polarity in Phrase-Level Sentiment Analysis (slides). https://www.slideserve.com/brendy/recognizing-contextual-polarity-in-phrase-level-sentiment-analysis