Deborah A. Dahl, Conversational Technologies – Chair, W3C Multimodal Interaction Working Group – PowerPoint PPT Presentation



SLIDE 1

Natural Language Processing for Sentiment Analysis Using the MMI Architecture

Get Smart: Smart Homes, Cars, Devices and the Web W3C Workshop on Rich Multimodal Application Development 22-23 July 2013, New York Metropolitan Area, USA

Deborah A. Dahl

Conversational Technologies Chair, W3C Multimodal Interaction Working Group

SLIDE 2

Sentiment Analysis

  • Analyzing text to determine subjective information, such as the writer’s emotional state or attitude toward the topic of the text
  • Useful for analyzing product reviews and social media posts

SLIDE 3

Demo: Classifying emotions expressed in text

  • Service receives text from a client (web page, app, another web service)
  • Classifies the text as expressing a specific emotion – happiness, sadness, boredom, etc. – from the set of 17 “everyday categories” (Cowie et al., 1999)
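The slides don’t say how the service performs the classification, but the input/output contract – plain text in, one everyday emotion category out – can be sketched with a trivial keyword lookup. The word lists below are invented purely for illustration and are not the demo’s actual method:

```javascript
// Toy stand-in for the classification step: map text to one of the
// "everyday" emotion categories by keyword lookup. The real service's
// classifier is not described in the slides; these cues are illustrative only.
const KEYWORDS = {
  happiness: ['happy', 'great', 'wonderful'],
  sadness: ['sad', 'unhappy', 'miserable'],
  boredom: ['bored', 'boring', 'dull'],
};

function classifyEmotion(text) {
  const words = text.toLowerCase().split(/\W+/);
  for (const [category, cues] of Object.entries(KEYWORDS)) {
    if (cues.some((cue) => words.includes(cue))) return category;
  }
  return null; // no cue matched; a real classifier would always pick a category
}
```

For example, `classifyEmotion('What a great day')` returns `'happiness'` under these toy word lists.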

SLIDE 4

Multimodal Standards for Sentiment Analysis

  • EmotionML – represents emotion
  • EMMA – represents text interpretation
  • MMI Architecture – defines the communication process among components
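The three standards nest: an EMMA document carries the interpretation of the input text, and an EmotionML `<emotion>` element inside that interpretation carries the classified category. A minimal sketch of such a combined result follows – the namespaces and the everyday category-set URI come from the EMMA 1.0 and EmotionML specifications, but the surrounding structure is an illustrative guess, not necessarily the demo’s exact output:

```javascript
// Sketch: build a combined EMMA + EmotionML result document as a string.
// Namespaces are those defined in the EMMA 1.0 and EmotionML 1.0 specs;
// the document layout is illustrative, not the demo's verbatim output.
function buildResult(inputText, category, confidence) {
  return [
    '<emma:emma version="1.0" xmlns:emma="http://www.w3.org/2003/04/emma">',
    `  <emma:interpretation id="interp1" emma:tokens="${inputText}">`,
    '    <emotion xmlns="http://www.w3.org/2009/10/emotionml"',
    '             category-set="http://www.w3.org/TR/emotion-voc/xml#everyday-categories">',
    `      <category name="${category}" confidence="${confidence}"/>`,
    '    </emotion>',
    '  </emma:interpretation>',
    '</emma:emma>',
  ].join('\n');
}
```

A call such as `buildResult('what a great day', 'happiness', 0.85)` yields an EMMA wrapper whose interpretation content is a single EmotionML category.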

SLIDE 5

Roles of Client and Web Service

  • Client

    – Captures text to be classified
    – Sends it to the web service for classification
    – Receives results
    – Provides feedback to the user about the results

  • Service

    – Receives a request to analyze a text
    – Classifies the emotion of the text
    – Returns the result to the client
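The client’s side of this exchange can be sketched as two small helpers: one that builds the MMI StartRequest carrying the text, and one that pulls the EmotionML category out of the eventual DoneNotification. The event names and attributes (`StartRequest`, `Context`, `RequestID`) follow the MMI Architecture specification; how the strings travel over the demo’s HTTP/Ajax transport is not shown here:

```javascript
// Sketch of the client side of the MMI exchange. Event and attribute names
// follow the MMI Architecture spec; the transport layer (HTTP/Ajax posting
// and polling) is omitted.

// Build an MMI StartRequest asking the service to classify `text`.
function buildStartRequest(text, requestId, contextId) {
  return [
    '<mmi:mmi xmlns:mmi="http://www.w3.org/2008/04/mmi-arch" version="1.0">',
    '  <mmi:StartRequest Source="client" Target="nlService"',
    `      Context="${contextId}" RequestID="${requestId}">`,
    `    <mmi:Data>${text}</mmi:Data>`,
    '  </mmi:StartRequest>',
    '</mmi:mmi>',
  ].join('\n');
}

// Pull the EmotionML category name out of a DoneNotification payload.
function extractCategory(doneNotification) {
  const m = /<category\s+name="([^"]+)"/.exec(doneNotification);
  return m ? m[1] : null;
}
```

`Source` and `Target` values like `"client"` and `"nlService"` are hypothetical placeholders; in a deployed system they would be the addresses of the actual interaction manager and modality component.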

SLIDE 6

Demo System

Client Web Page

  • HTML
    – Graphical display
    – Capture of user input
  • Javascript
    – EMMA: build EMMA documents from text; interpret returned EMMA documents
    – MMI Architecture: send MMI events to the server over HTTP/Ajax; poll for results; receive events from the server
    – EmotionML: interpret EmotionML documents
    – Interaction Manager: application-specific; controls HTML; displays results

Server

  • Natural language modality component (RESTful web service on the Amazon cloud, Java)
    – Receives the MMI event
    – Sends “StartResponse”
    – Classifies the text into everyday emotions
    – Packages the result into EmotionML + EMMA
    – Creates an MMI “DoneNotification” event
    – Returns the event to the client
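The server’s event handling steps can be sketched end to end. The demo’s actual modality component is written in Java; the JavaScript version below only illustrates the protocol flow, and `classify` is a hypothetical hook standing in for the real classifier:

```javascript
// Sketch of the server's event handling: acknowledge, classify, package,
// notify. The demo's real component is Java; this JavaScript version is an
// illustration only, and classify() is a hypothetical classifier hook.
function handleStartRequest(requestId, contextId, text, classify) {
  // 1. Acknowledge immediately with a StartResponse.
  const startResponse =
    '<mmi:mmi xmlns:mmi="http://www.w3.org/2008/04/mmi-arch" version="1.0">' +
    '<mmi:StartResponse Source="nlService" Target="client" ' +
    `Context="${contextId}" RequestID="${requestId}" Status="success"/>` +
    '</mmi:mmi>';

  // 2. Classify the text into an everyday emotion category.
  const category = classify(text);

  // 3.-5. Package the result as EmotionML inside EMMA, wrap it in a
  // DoneNotification, and return it for delivery to the client.
  const doneNotification =
    '<mmi:mmi xmlns:mmi="http://www.w3.org/2008/04/mmi-arch" version="1.0">' +
    '<mmi:DoneNotification Source="nlService" Target="client" ' +
    `Context="${contextId}" RequestID="${requestId}" Status="success">` +
    '<mmi:Data>' +
    '<emma:emma version="1.0" xmlns:emma="http://www.w3.org/2003/04/emma">' +
    '<emma:interpretation id="interp1">' +
    '<emotion xmlns="http://www.w3.org/2009/10/emotionml">' +
    `<category name="${category}"/>` +
    '</emotion></emma:interpretation></emma:emma>' +
    '</mmi:Data></mmi:DoneNotification></mmi:mmi>';

  return { startResponse, doneNotification };
}
```

Returning both documents at once mirrors the slide’s step list; in the running demo the StartResponse goes back immediately over HTTP while the client polls for the DoneNotification.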

SLIDE 7

More Information

  • EmotionML: http://www.w3.org/TR/emotionml/
  • EmotionML vocabularies: http://www.w3.org/TR/emotion-voc/

  • EMMA: http://www.w3.org/TR/emma/
  • MMI Architecture: http://www.w3.org/TR/mmi-arch/
  • Demo (alpha): http://nlportal.elasticbeanstalk.amazon.com (works best in Firefox or Chrome)

SLIDE 8

Demo Display