Affective Categorization Using Contact-less Based Accelerometers
Speaker: Refael Shamir, Founder and CEO of Letos



  1. March 26–29, 2018 | Silicon Valley | #GTC18 | www.gputechconf.com. Affective Categorization Using Contact-less Based Accelerometers. Speaker: Refael Shamir, Founder and CEO of Letos

  2. Presentation Outline
     – Motivation: driver monitoring in the new age
     – Background and Definitions: first steps to understanding affect categorization
     – Technology Review: facial expressions; eye tracking; voice recognition; wearable monitoring (EEG, ECG, GSR, PPG...); sentiment analysis
     – Current State of the Art: gaps and challenges
     – Introducing Letos: how, when and where

  3. Misconception?

  4. Presentation Outline (repeated as a section marker; next: Motivation)

  5. Motivation

  6. Motivation
     – There is a growing debate around in-cabin driver monitoring (e.g., tracking alertness)
     – Gaze estimation is only part of the solution
     – Keeping one's eyes on the road does not indicate the alertness level with high confidence
     – The driver's engagement level needs to be tracked at all times

  7. Presentation Outline (repeated as a section marker; next: Background and Definitions)

  8. Background and Definitions

  9. Background and Definitions – Cont'd. Affective Computing: Picard first introduced the term "affective computing" in 1995, as a means of evaluating different emotions, or expressions, from a computer's perspective. [Rosalind W. Picard, Affective Computing. MIT Press, 1997]

  10. The Dimensional Affective State Model. Valence (X-axis, negative to positive) indicates the pleasure/comfort level; arousal (Y-axis, low to high) indicates the excitement/engagement level. [Figure: emotions placed by quadrant in the valence-arousal plane, e.g., joy and surprise (positive valence, high arousal); fear, anger and disgust (negative valence, high arousal); relief (positive valence, low arousal); sadness (negative valence, low arousal)]
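
As an aside, the dimensional model lends itself to a very small piece of code: given a (valence, arousal) estimate from any of the sensors discussed later, a coarse label falls out of the two signs. The thresholds and example emotions below are illustrative assumptions, not the talk's classifier.

    # Illustrative mapping from the dimensional model's two axes to a
    # coarse affect label; thresholds and example emotions are assumed.
    def affect_quadrant(valence: float, arousal: float) -> str:
        """Map (valence, arousal), each normalized to [-1, 1], to a quadrant."""
        if arousal >= 0:
            return ("high arousal, positive (e.g., joy, surprise)"
                    if valence >= 0 else
                    "high arousal, negative (e.g., fear, anger)")
        return ("low arousal, positive (e.g., relief)"
                if valence >= 0 else
                "low arousal, negative (e.g., sadness)")

    print(affect_quadrant(0.7, 0.8))  # high arousal, positive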

  11. Presentation Outline (repeated as a section marker; next: Technology Review)

  12. Tools for Evaluating Affective States

  13. Suggested Prototype for Auto-Classification. [Block diagram: Input Source → Pre-Processing & Object Detection → Feature Extraction & Feature Selection → Post-Processing → Output, with a Training stage producing the Classifier Model]
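
This pipeline maps naturally onto a standard ML stack. Below is a minimal sketch (not from the talk) of the same stages as a scikit-learn Pipeline, run on placeholder random data; the feature dimensionality, the SelectKBest selector, and the SVM classifier are all illustrative assumptions, since the slide does not specify them.

    # Sketch of the slide's stages: normalization -> feature selection ->
    # training -> classifier model -> output labels.
    import numpy as np
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.svm import SVC

    # Placeholder data: 200 samples, 40 extracted features, 4 affect classes.
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(200, 40)), rng.integers(0, 4, size=200)

    model = Pipeline([
        ("scale", StandardScaler()),               # normalization
        ("select", SelectKBest(f_classif, k=10)),  # feature selection
        ("clf", SVC(kernel="rbf")),                # classifier model
    ])
    model.fit(X, y)               # training stage
    print(model.predict(X[:5]))   # output: predicted affect labels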

  14. Ekman's Model. Paul Ekman argues that there are six basic facial expressions, uniquely distinguishable from one another, that correspond to emotional states (Ekman, 1972). According to Ekman, this set of emotions is expressed across humans regardless of age, gender, race, or culture.

  15. Facial Expressions

  16. Feature Extraction for Affect Classification
     – Geometric features: detect the face (shape/size); detect cue points (lips, eyebrows) → categorize emotions based on the cue points' positions relative to the face
     – Appearance-based features: detect the face (shape/size); apply texture layering (filters) → categorize emotions based on the extracted feature type
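
For the geometric route, here is a hedged sketch using dlib's off-the-shelf face detector and 68-point landmark predictor: detect the face, locate lip and eyebrow cue points, and normalize them by the face box. The image file name is hypothetical, and the pretrained shape_predictor_68_face_landmarks.dat model must be downloaded separately; this is one common way to implement the slide's idea, not the speaker's implementation.

    # Geometric features: face detection, cue points (lips, eyebrows),
    # positions expressed relative to the face box.
    import dlib

    detector = dlib.get_frontal_face_detector()
    predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

    img = dlib.load_rgb_image("driver.jpg")   # hypothetical input frame
    for face in detector(img):
        shape = predictor(img, face)
        w, h = face.width(), face.height()
        # 68-point convention: points 17-26 are eyebrows, 48-67 the lips.
        brow = [((shape.part(i).x - face.left()) / w,
                 (shape.part(i).y - face.top()) / h) for i in range(17, 27)]
        lips = [((shape.part(i).x - face.left()) / w,
                 (shape.part(i).y - face.top()) / h) for i in range(48, 68)]
        features = brow + lips  # feed into a classifier such as the one above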

  17. Technology Analysis
     – Capturing the duration of the emotion: 0.5–4 seconds
     – Differentiating positive from negative (valence): good confidence
     – Spontaneous reactions (faking?): an open question
     – Evaluating the intensity level (arousal): can still be solved using a camera

  18. Human Eye. [Figure: eye anatomy, labeling the eyebrows, eyelashes, eyelids, pupil, iris, and sclera]

  19. How Sherlock Does It: https://www.youtube.com/v/-bBHT158E0s?start=107&end=128

  20. Pupillary Response, Explained. At rest, the pupil usually constricts (down to about 2 mm) due to parasympathetic activity. When a stimulus is presented, the pupil tends to dilate (up to about 8 mm) due to sympathetic activity (Bradley, Miccoli, Escrig, & Lang, 2008).
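
As a rough illustration (not from the talk) of how this response could feed an affect classifier: compare the mean pupil diameter in a short post-stimulus window against a pre-stimulus baseline. The sampling rate, window lengths, and dilation threshold below are illustrative assumptions.

    import numpy as np

    def pupil_dilation_response(diam_mm, fs, stim_idx,
                                baseline_s=1.0, window_s=2.0, thresh_mm=0.5):
        """Flag sympathetic-style dilation: mean diameter after the stimulus
        exceeds the pre-stimulus baseline by more than `thresh_mm`."""
        base = np.mean(diam_mm[stim_idx - int(baseline_s * fs):stim_idx])
        post = np.mean(diam_mm[stim_idx:stim_idx + int(window_s * fs)])
        return post - base, (post - base) > thresh_mm

    # Toy series: rest near 3 mm, dilation toward 5 mm after sample 120.
    fs = 60  # assumed eye-tracker sampling rate (Hz)
    d = np.concatenate([np.full(120, 3.0), np.full(120, 5.0)])
    print(pupil_dilation_response(d, fs, stim_idx=120))  # (~2.0, True)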

  21. References and Further Reading
     1. Li and Jain. Handbook of Face Recognition, 2nd ed. New York: Springer, 2011. Print.
     2. R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, S. Kollias, W. Fellenz, and J. G. Taylor. "Emotion Recognition in Human-Computer Interaction". IEEE Signal Processing Magazine, 18 (2001): 32-80. Print.
     3. Open source tool: OpenFace, "Free and open source face recognition with deep neural networks". GitHub (accessed February 2018).

  22. Human Speech – Overview
     – Speech is basically a stream of words spoken in a particular way
     – To produce different syllables, the vocal cords vibrate and the sound is sequentially filtered through the mouth and nose
     – In general, speech is carried over an anchor (fundamental) frequency, which varies across scenarios; it is commonly denoted 'F0'
     https://www.youtube.com/watch?v=yxxRAHVtafI

  23. Speech Recognition – Background. Human speech can be modeled by separating what is being transmitted in the message (primary) from its intended affect (secondary), e.g., "That's so funny", "This upsets me", "I'm so happy".

  24. Feature Extraction for Affect Classification
     – Pitch: usually compared to the base frequency F0
     – Voice/volume level: higher levels might indicate anger or fear
     – Speaking rate: can indicate the speaker's confidence level
     – Generally, voice indicates merely the arousal level
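
A minimal sketch of extracting these prosodic cues with the librosa library (not the toolchain used in the talk): F0 per frame via the YIN estimator, volume via RMS energy, and a crude energetic-frame ratio as a stand-in for speaking rate. The file name and thresholds are assumptions.

    import numpy as np
    import librosa

    y, sr = librosa.load("utterance.wav", sr=16000)  # hypothetical clip

    # Pitch: fundamental frequency (F0) per frame, via the YIN estimator.
    f0 = librosa.yin(y, fmin=60, fmax=400, sr=sr)

    # Voice/volume level: RMS energy per frame.
    rms = librosa.feature.rms(y=y)[0]

    # Crude speaking-rate proxy: fraction of energetic ("voiced-ish") frames;
    # real systems use syllable nuclei or ASR word timings instead.
    voiced = rms > 0.5 * np.median(rms)

    print(f"median F0: {np.median(f0):.1f} Hz, "
          f"mean RMS: {rms.mean():.4f}, energetic frames: {voiced.mean():.2f}")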

  25. Speech to Text. Source: https://hacks.mozilla.org/2017/11/a-journey-to-10-word-error-rate/

  26. [Word cloud of business-related terms (prosperity, entrepreneur, organization, complexity, navigation, customers, partnership, motivation, determination, teamwork, etc.) as an example of raw text input]

  27. Sentiment Orientation. Sentiment (or text) analysis can infer the positive or negative opinions, i.e., the valence, that people express either through voice or in writing.

  28. Not So Fast...
     – Not all expressions have a single meaning, e.g., "That's so funny" is a positive sentence, but tone can give it a different meaning (sarcasm)
     – Intensity rating: WEAK / MEDIUM / HIGH, e.g., positive: good / wonderful / amazing; negative: bad / poor / terrible
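
A toy sketch of valence scoring in the spirit of this intensity table: a tiny hand-made lexicon with WEAK/MEDIUM/HIGH weights plus an intensifier rule. The words and weights are illustrative assumptions; production systems (and the tools referenced below) handle negation, sarcasm, and far larger lexicons.

    # Toy lexicon mirroring the slide's WEAK/MEDIUM/HIGH intensity tiers.
    LEXICON = {"good": 1, "wonderful": 2, "amazing": 3,
               "bad": -1, "poor": -2, "terrible": -3}
    INTENSIFIERS = {"so", "very", "really"}  # boost the following word

    def valence(text: str) -> int:
        score, boost = 0, 1
        for raw in text.lower().split():
            word = raw.strip(".,!?'\"")
            if word in INTENSIFIERS:
                boost = 2
                continue
            score += boost * LEXICON.get(word, 0)
            boost = 1
        return score

    print(valence("That's so amazing"))             # 6 (intensified HIGH positive)
    print(valence("Poor service, terrible food"))   # -5 (negative)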

  29. Linguistics as a Means for Classifying Emotions. Guidelines: intensifiers, e.g., the "so" in "That's so funny" (other examples: "I'm happy that you're here", "This upsets me"); usage of bad language [cursing, insulting, blaming, etc.]

  30. References and Further Reading
     1. Johar, Swati. Emotion, Affect and Personality in Speech. Springer, 2016. Print.
     2. R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, S. Kollias, W. Fellenz, and J. G. Taylor. "Emotion Recognition in Human-Computer Interaction". IEEE Signal Processing Magazine, 18 (2001): 32-80. Print.
     3. Open source tool: openEAR (originated at TUM), https://sourceforge.net/projects/openart/
     4. Knowledge center: http://sentic.net (accessed March 2018)

  31. Different Types of Monitoring

  32. Autonomic Nervous System (ANS): Sympathetic ("fight or flight") vs. Parasympathetic ("rest and digest"). Source: Hemmings, H., Pharmacology and Physiology for Anesthesia: Foundations and Clinical Application. Saunders, 2013.

  33. ANS – Continued. [Figure: contrasting sympathetic nervous system activity with parasympathetic nervous system activity]

  34. Human Heart

  35. Heart Rate Measurement. A person's heart rate can be extracted through either an ECG or a PPG (nowadays, usually via smartwatches).

  36. Heart Rate Variability
     – Heart rate variability (HRV) is the variation of consecutive beat-to-beat (B2B) intervals
     – It indicates the heart's ability to respond to stimuli such as breathing, exercise, stress, disease, or sleep
     – HRV decreases with sympathetic (SNS) activity and increases with parasympathetic activity
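
A compact sketch (not from the talk) of two standard time-domain HRV statistics, SDNN and RMSSD, computed from a list of beat-to-beat (RR) intervals; the interval values below are made up for illustration.

    import numpy as np

    def hrv_time_domain(rr_ms):
        """SDNN and RMSSD from consecutive beat-to-beat (RR) intervals in ms.
        Lower values suggest sympathetic dominance; higher values suggest
        parasympathetic ('rest and digest') influence."""
        rr = np.asarray(rr_ms, dtype=float)
        sdnn = rr.std(ddof=1)                        # overall variability
        rmssd = np.sqrt(np.mean(np.diff(rr) ** 2))   # beat-to-beat variability
        hr = 60000.0 / rr.mean()                     # mean heart rate (bpm)
        return hr, sdnn, rmssd

    # Illustrative RR series (~75 bpm with mild variability).
    rr = [810, 790, 820, 805, 798, 815, 792, 808]
    print("HR %.1f bpm, SDNN %.1f ms, RMSSD %.1f ms" % hrv_time_domain(rr))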
