

SLIDE 1

Affective Categorization Using Contact-less Based Accelerometers

March 26—29, 2018 | Silicon Valley | #GTC18

www.gputechconf.com

Speaker: Refael Shamir, Founder and CEO of Letos

SLIDE 2

Presentation Outline

Motivation
– Driver monitoring in the new age

Background and Definitions
– First steps to understanding Affect Categorization

Technology Review
– Facial Expressions
– Eye Tracking
– Voice Recognition
– Wearable Monitoring
– EEG, ECG, GSR, PPG…
– Sentiment Analysis

Current State of the Art – Gap and Challenges
Introducing Letos – How, When and Where

SLIDE 3

Misconception?

SLIDE 4

Presentation Outline

Motivation
– Driver monitoring in the new age

Background and Definitions
– First steps to understanding Affect Categorization

Technology Review
– Facial Expressions
– Eye Tracking
– Voice Recognition
– Wearable Monitoring
– EEG, ECG, GSR, PPG…
– Sentiment Analysis

Current State of the Art – Gap and Challenges
Introducing Letos – How, When and Where

SLIDE 5

Motivation

SLIDE 6

Motivation

‒ There is a growing debate around in-cabin driver monitoring (e.g. tracking alertness)
‒ Gaze estimation is only part of the solution
‒ Keeping eyes on the road does not indicate the alertness level with high confidence
‒ The driver's engagement level needs to be tracked at all times

SLIDE 7

Presentation Outline

Motivation
– Driver monitoring in the new age

Background and Definitions
– First steps to understanding Affect Categorization

Technology Review
– Facial Expressions
– Eye Tracking
– Voice Recognition
– Wearable Monitoring
– EEG, ECG, GSR, PPG…
– Sentiment Analysis

Current State of the Art – Gap and Challenges
Introducing Letos – How, When and Where

SLIDE 8

Background and Definitions

SLIDE 9

Background and Definitions – Cont’d

Affective Computing: Picard first introduced the term “affective computing” in 1995, as a means to evaluate different emotions, or expressions, from a computer’s perspective

Rosalind W. Picard, Affective Computing (MIT Press, 1997)
SLIDE 10

The Dimensional Affective State Model

Arousal (Y-axis) – indicates excitement/engagement level
Valence (X-axis) – indicates pleasure/comfort level

[Figure: the valence–arousal plane divided into quadrants I–IV (positive/negative valence × low/high arousal), with example emotions – anger, sadness, fear, joy, surprise, disgust, relief – placed in their quadrants]
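As a toy illustration, the quadrant lookup implied by this model can be written as a small function. The per-quadrant emotion examples in the comments are assumptions for illustration, not a definitive placement:

```python
# Toy lookup for the dimensional model: map a (valence, arousal) pair,
# normalized to [-1, 1], to its quadrant. The per-quadrant emotion examples
# in the comments are illustrative assumptions.

def affect_quadrant(valence, arousal):
    if valence >= 0 and arousal >= 0:
        return "I"    # e.g. joy, surprise (positive valence, high arousal)
    if valence < 0 and arousal >= 0:
        return "II"   # e.g. anger, fear (negative valence, high arousal)
    if valence < 0:
        return "III"  # e.g. sadness (negative valence, low arousal)
    return "IV"       # e.g. relief (positive valence, low arousal)
```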

SLIDE 11

Presentation Outline

Motivation
– Driver monitoring in the new age

Background and Definitions
– First steps to understanding Affect Categorization

Technology Review
– Facial Expressions
– Eye Tracking
– Voice Recognition
– Wearable Monitoring
– EEG, ECG, GSR, PPG…
– Sentiment Analysis

Current State of the Art – Gap and Challenges
Introducing Letos – How, When and Where

SLIDE 12

Tools for Evaluating Affective States

SLIDE 13

Suggested Prototype for Auto-Classification

Input Source → Pre-Processing & Object Detection → Feature Extraction → Feature Selection → Training Model / Classifier → Post-Processing → Output
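These stages can be sketched as a minimal, hypothetical pipeline. Every function below is a stand-in assumption (a nearest-centroid rule takes the place of a real trained model, and feature selection is left as the identity), not the actual implementation:

```python
import numpy as np

# Minimal sketch of the suggested pipeline. All function names are
# hypothetical stand-ins: a nearest-centroid rule replaces a real training
# model/classifier, and the feature-selection stage is the identity.

def preprocess(window):
    # Pre-processing: zero-mean, unit-variance normalization of one window.
    w = np.asarray(window, dtype=float)
    return (w - w.mean()) / (w.std() + 1e-9)

def extract_features(window):
    # Feature extraction: a few summary statistics of the window.
    return np.array([window.min(), window.max(), np.abs(np.diff(window)).mean()])

def fit_centroids(windows, labels):
    # "Training model": one centroid per class in feature space.
    feats = np.array([extract_features(preprocess(w)) for w in windows])
    labels = np.array(labels)
    return {c: feats[labels == c].mean(axis=0) for c in set(labels)}

def classify(window, centroids):
    # Classifier + post-processing: return the label of the nearest centroid.
    f = extract_features(preprocess(window))
    return min(centroids, key=lambda c: float(np.linalg.norm(f - centroids[c])))
```

In a real system each box would be a substantial component (e.g. a deep feature extractor and an SVM or neural classifier); the point here is only the data flow from input source to output label.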

SLIDE 14
SLIDE 15

Ekman’s Model

Paul Ekman argues that there are 6 basic facial expressions, uniquely distinguishable from one another, each related to an emotional state (Ekman, 1972). According to Ekman, this set of emotions is expressed across humans regardless of age, gender, race, or culture

SLIDE 16

Facial Expressions


SLIDE 17

Feature Extraction for Affect Classification

Geometric Features
– Detecting the face (shape/size)
– Detecting cue points (lips, eyebrows)
→ Categorize emotions based on the cue points’ position relative to the face

Appearance Based
– Detecting the face (shape/size)
– Texture layering (filters)
→ Categorize emotions based on the extracted feature type
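A toy sketch of the geometric route follows. The landmark names and coordinates are hypothetical; in practice they would come from a facial-landmark detector:

```python
# Toy sketch of the geometric-feature route. The landmark names and (x, y)
# coordinates are hypothetical; in practice they come from a facial-landmark
# detector, and many cue points feed a trained classifier.

def smile_score(landmarks):
    # Mouth-corner spread relative to face width: larger values suggest the
    # mouth corners are pulled outward, a crude smile cue.
    face_w = landmarks["face_right"][0] - landmarks["face_left"][0]
    mouth_w = landmarks["mouth_right"][0] - landmarks["mouth_left"][0]
    return mouth_w / face_w

# Hypothetical pixel coordinates for two frames:
neutral = {"face_left": (0, 50), "face_right": (100, 50),
           "mouth_left": (35, 70), "mouth_right": (65, 70)}
smiling = {"face_left": (0, 50), "face_right": (100, 50),
           "mouth_left": (25, 68), "mouth_right": (75, 68)}
```

Normalizing by face width is what makes the feature “relative to the face”: the same ratio is obtained whether the face fills the frame or sits far from the camera.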

SLIDE 18

Technology Analysis

– Capturing the duration of the emotion (typically 0.5–4 seconds)
– Differentiating positive and negative valence
– Detecting spontaneous reactions (vs. faking)
– Evaluating intensity level (arousal)

Confidence: this can still be solved using a camera!

SLIDE 19

Human Eye

Pupil, iris, eyelashes, eyebrows, sclera, eyelids

SLIDE 20

How Sherlock Does It

https://www.youtube.com/v/-bBHT158E0s?start=107&end=128

SLIDE 21

Pupillary Response - Explained

During rest, the eye’s pupil usually constricts (toward ~2 mm in diameter), due to parasympathetic activity. When a stimulus is presented, the pupil tends to dilate (toward ~8 mm), due to sympathetic activity

(Bradley, Miccoli, Escrig, & Lang, 2008)
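A minimal sketch of how this response could be quantified: percent dilation relative to a resting baseline. The millimetre values are illustrative assumptions; real traces come from an eye tracker and need blink removal and luminance control:

```python
# Sketch: quantify the pupillary response as percent dilation relative to a
# resting baseline. The millimetre values used below are illustrative; real
# traces come from an eye tracker and need blink removal and luminance control.

def pupil_dilation_pct(baseline_mm, current_mm):
    # Positive values indicate dilation (sympathetic activity),
    # negative values indicate constriction (parasympathetic activity).
    return 100.0 * (current_mm - baseline_mm) / baseline_mm
```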

SLIDE 22

References and Further Reading

1. Li and Jain. Handbook of Face Recognition, 2nd Edition. New York: Springer, 2011. Print
2. R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, S. Kollias, W. Fellenz, and J. G. Taylor. “Emotion Recognition in Human-Computer Interaction”. IEEE Signal Processing Magazine, 18 (2001): 32–80. Print
3. Open source tool: “Free and open source face recognition with deep neural networks”. OpenFace, GitHub (accessed February 2018)

SLIDE 23
SLIDE 24

Human Speech – Overview

‒ Speech is basically a stream of words spoken in a particular way
‒ To produce different syllables, the vocal cords vibrate and the sound is then filtered through the mouth and nose
‒ In general, speech is carried over an anchor frequency (which varies between scenarios), often denoted ‘F0’

https://www.youtube.com/watch?v=yxxRAHVtafI

SLIDE 25

Speech Recognition - Background

Human speech can be modeled by differentiating what is being transmitted in the message (primary) from its intended affect (secondary)

“This upsets me” “That’s so funny” “I’m so happy”

SLIDE 26

Feature Extraction for Affect Classification

Pitch: usually compared against the base frequency F0
Voice/volume level: higher levels might indicate anger or fear
Speaking rate: can indicate the speaker’s confidence level

Generally, voice indicates mainly the arousal level
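Two of these cues can be sketched in pure NumPy for a single mono frame: volume as RMS energy, and pitch (F0) from the autocorrelation peak. This is an illustrative assumption, not a production method; real systems use robust pitch trackers (e.g. YIN) plus voiced/unvoiced detection:

```python
import numpy as np

# Pure-NumPy sketch (illustrative, not production-grade) of two vocal cues
# for one mono frame: volume as RMS energy, and pitch (F0) as the lag of the
# autocorrelation peak within a plausible voice range.

def rms(frame):
    # Volume/energy of the frame.
    frame = np.asarray(frame, dtype=float)
    return float(np.sqrt(np.mean(frame ** 2)))

def estimate_f0(frame, sr, fmin=60.0, fmax=400.0):
    # Pitch: search autocorrelation lags corresponding to fmin..fmax Hz.
    frame = np.asarray(frame, dtype=float)
    frame = frame - frame.mean()
    ac = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    lo, hi = int(sr / fmax), int(sr / fmin)
    lag = lo + int(np.argmax(ac[lo:hi]))
    return sr / lag

# A synthetic 120 Hz "voiced" frame: 0.1 s at an 8 kHz sampling rate.
sr = 8000
t = np.arange(0, 0.1, 1.0 / sr)
tone = 0.5 * np.sin(2 * np.pi * 120.0 * t)
```

A real system would track F0 and energy frame by frame relative to the speaker’s own baseline; speaking rate needs word or syllable segmentation and is left out here.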

SLIDE 27

Speech to Text

Source: https://hacks.mozilla.org/2017/11/a-journey-to-10-word-error-rate/

SLIDE 28

[Figure: a word cloud of business-related vocabulary, e.g. meaning, opportunity, market, communication, business, motivation, technology, complexity, employment, partnership, determination, concentration]

SLIDE 29

Sentiment Orientation

Sentiment (or text) analysis can infer positive and negative – i.e. valence – opinions that people express, either through voice or in writing

SLIDE 30

Not So Fast...

‒ Not all expressions have a single meaning
  • e.g. ”That’s so funny” – a positive sentence whose meaning changes with tone (sarcasm)

Intensity:   WEAK   MEDIUM      HIGH
Positive:    Good   Wonderful   Amazing
Negative:    Bad    Poor        Terrible
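The intensity ladder above can be turned into a toy lexicon-based scorer. The word weights are illustrative assumptions, not a real sentiment lexicon:

```python
# Toy lexicon-based scorer built from the intensity ladder above. The word
# weights are illustrative assumptions, not a real sentiment lexicon.

LEXICON = {
    "good": 1, "wonderful": 2, "amazing": 3,   # positive: weak -> high
    "bad": -1, "poor": -2, "terrible": -3,     # negative: weak -> high
}

def sentiment_score(text):
    # Sum of word valences: the sign gives polarity, the magnitude intensity.
    return sum(LEXICON.get(w.strip(".,!?").lower(), 0) for w in text.split())
```

Note that such word counting is exactly what tone and sarcasm break: “That’s so funny” scores neutral-to-positive here regardless of how it was said.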

SLIDE 31

Linguistics as a Means for Classifying Emotions

Guidelines:

‒ Example phrases: “This upsets me”, “That’s so funny”, “I’m happy that you’re here”
‒ Intensifiers
‒ Usage of bad language [cursing, insulting, blaming, etc.]

SLIDE 32

References and Further Reading

1. Johar, Swati. Emotion, Affect and Personality in Speech. Springer, 2016. Print
2. R. Cowie, E. Douglas-Cowie, N. Tsapatsoulis, S. Kollias, W. Fellenz, and J. G. Taylor. “Emotion Recognition in Human-Computer Interaction”. IEEE Signal Processing Magazine, 18 (2001): 32–80. Print
3. Open source tool: openEAR (originated at TUM), https://sourceforge.net/projects/openart/
4. Knowledge center: http://sentic.net (accessed March 2018)
SLIDE 33
SLIDE 34

Different Types of Monitoring

SLIDE 35

Autonomous Nervous System (ANS)

Sympathetic – “fight or flight”
Parasympathetic – “rest and digest”

Source: Hemmings, H., Pharmacology and Physiology for Anesthesia: Foundations and Clinical Application. Saunders, 2013

SLIDE 36

ANS – Continued

Sympathetic Nervous System activity: Parasympathetic Nervous System activity:

SLIDE 37

Human Heart

SLIDE 38

Heart Rate Measurement

A person’s heart rate can be extracted through either an ECG or a PPG (nowadays usually found in smartwatches)
SLIDE 39

Heart Rate Variability

‒ Heart rate variability (HRV) is the variation of consecutive beat-to-beat (b2b) intervals
‒ It indicates the heart’s ability to respond to stimuli such as breathing, exercise, stress, disease, or sleep
‒ It decreases with sympathetic activity and increases with parasympathetic activity
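Two standard HRV statistics can be computed directly from a series of beat-to-beat (RR) intervals; the RR values below are illustrative:

```python
import numpy as np

# Two standard HRV statistics from beat-to-beat (RR) intervals in
# milliseconds. SDNN reflects overall variability; RMSSD emphasizes the
# short-term component associated with parasympathetic activity.

def sdnn(rr_ms):
    # Sample standard deviation of all RR intervals.
    return float(np.std(np.asarray(rr_ms, dtype=float), ddof=1))

def rmssd(rr_ms):
    # Root mean square of successive RR differences.
    diffs = np.diff(np.asarray(rr_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

rr = [800, 810, 790, 820, 805, 795]  # illustrative RR intervals (ms), ~75 bpm
```

Lower RMSSD over a window is consistent with sympathetic dominance (stress), higher RMSSD with parasympathetic activity, matching the bullet above.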

SLIDE 40

Presentation Outline

Technology Overview
– Companies; Market; Use Cases

Background and Definitions
– First steps to understanding Affect Categorization

Technology Review
– Facial Expressions
– Voice Recognition
– Wearable Monitoring Devices
– EEG, ECG, GSR, PPG…
– Sentiment Analysis

Current State of the Art – Gap and Challenges
Introducing Letos – How, When and Where

SLIDE 41

Current State of the Art

Currently, most commercial products give merely a differentiation between positive and negative emotions:

NEGATIVE: sadness, anger, misery
NEUTRAL: calm
POSITIVE: joy, happiness, enthusiasm

STRESS!

SLIDE 42

Challenges

‒ Self-assessment
‒ Multi-modal approach
‒ Spontaneous; unobtrusive
‒ Awareness

SLIDE 43

Presentation Outline

Technology Overview
– Companies; Market; Use Cases

Background and Definitions
– First steps to understanding Affect Categorization

Technology Review
– Facial Expressions
– Voice Recognition
– Wearable Monitoring Devices
– EEG, ECG, GSR, PPG…
– Sentiment Analysis

Current State of the Art – Gap and Challenges
Introducing Letos – How, When and Where

SLIDE 44
SLIDE 45

Recognizing users’ emotional reactions anonymously across different scenarios

SLIDE 46

Solution

Advanced machine learning techniques for performing affective classification based on physiological signals:
– 5 emotional states
– ML/DL for classifying emotional states
– Complete anonymity

SLIDE 47

Product

SLIDE 48

Product – Features

– Contact-less single sensor solution (MEMS)
– Heart Rate; Heart Rate Variability
– Respiration Rate
– 1 Sample/Second
– Wireless Communication

SLIDE 49

Demonstration

SLIDE 50

Demonstration

Respiration Rate; Heart Rate

SLIDE 51

Ballistocardiography

Ballistocardiography is a non-invasive method based on the measurement of the body motion generated by the ejection of the blood at each cardiac cycle. It is one of the many methods relying on detection of cardiac and cardiovascular-related mechanical motions, such as phonocardiography, apexcardiography, seismocardiography, kinetocardiography to list just a few.

http://people.csail.mit.edu/balakg/pulsefromheadmotion.html
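The core idea can be sketched numerically: take a BCG-like accelerometer trace and read the heart rate off the dominant spectral peak in the cardiac band. The signal below is synthetic and the 100 samples/s rate is an assumption for illustration (real BCG additionally needs motion-artifact rejection):

```python
import numpy as np

# Sketch: recover heart rate from a BCG-like accelerometer trace via the
# dominant spectral peak in the cardiac band (0.7-3 Hz, i.e. 42-180 bpm).
# The trace below is synthetic; a real pipeline also rejects motion artifacts.

def bcg_heart_rate_bpm(accel, fs, fmin=0.7, fmax=3.0):
    accel = np.asarray(accel, dtype=float)
    accel = accel - accel.mean()                     # remove the DC offset
    spectrum = np.abs(np.fft.rfft(accel))
    freqs = np.fft.rfftfreq(len(accel), d=1.0 / fs)
    band = (freqs >= fmin) & (freqs <= fmax)         # restrict to cardiac band
    return 60.0 * float(freqs[band][np.argmax(spectrum[band])])

fs = 100.0                       # assumed accelerometer rate (samples/s)
t = np.arange(0, 30, 1.0 / fs)   # 30-second window
rng = np.random.default_rng(1)
# A 1.2 Hz cardiac component (72 bpm) buried under a stronger 0.25 Hz
# respiration component and sensor noise:
accel = (0.02 * np.sin(2 * np.pi * 1.2 * t)
         + 0.10 * np.sin(2 * np.pi * 0.25 * t)
         + 0.01 * rng.standard_normal(t.size))
```

Because respiration (~0.2–0.4 Hz) falls outside the cardiac band, the same trace can yield both respiration rate and heart rate by searching two different bands, which mirrors the two readouts in the demonstration above.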

SLIDE 52

www.letos.me

Thank You

info@letos.me