IIS - PROJECT 3 Group 8 - Qian Sun, Louis Janse van Rensburg, Abdulghani Zubeir

SLIDE 1

IIS - PROJECT 3 Group 8

Qian Sun Louis Janse van Rensburg Abdulghani Zubeir

SLIDE 2

Previous Literature

Virtual vs Physical Embodiment

  • Costa et al. (2016) found users are more attentive to emotional storytelling by a physically embodied robot with facial expression capability
  • Nao limitation: no facial motors; dependence on postural expression and use of non-facial expression capacities (sound/voice, LED colors)
SLIDE 3

Previous Literature (cont.)

Conveying Emotion without Facial Expression

  • Erden (2013) used postural expression derived from behavioural studies (Coulson, 2004) to express Happiness, Sadness & Fear in a Nao robot
  • Hyde et al. (2014) found users rate emotions more intensely when agents have ‘high’ auditory emotion level
  • Paiva et al. (2014) highlight the use of color (LEDs) in robot emotion expression

SLIDE 4

Research Questions

When combining the LED, sound & postural capacities of the Nao robot:

1. Does the rate of correct identification of the emotion expressed (Happiness, Sadness, Fear, Surprise, Anger, Disgust) differ between the Virtual and Physical embodiment conditions?
2. Do ratings of naturalness and intensity differ between the Virtual and Physical embodiment conditions:
   ○ Across all emotions
   ○ For each individual emotion?

SLIDE 5

Methodology

1. Emotion postures for the 6 emotions derived from Coulson (2004)
   ○ For each emotion they determined the most identifiable posture from 32 possible candidate postures
2. Transformation functions of joint angles, used by Erden (2013) to map the happiness posture to the Nao robot
   ○ We extrapolated this transformation function to determine motor-joint angles in Nao postures for all emotions
3. LED colors determined by psychological color-emotion association (Parker, 2012)
4. Emotion vocalization sounds selected from public soundbite sites
SLIDE 6

Deriving Postures

Posture Joint Angles for 6 Basic Emotions (Coulson, 2004) → Transformation Functions (Erden, 2013) → Nao Motor Angles → Implemented Using Choregraphe (Physical/Virtual)
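Erden's transformation functions are not reproduced on the slides, so the following is only a minimal sketch of the general idea: a linear map from a Coulson-style posture angle to a Nao motor angle, clamped to the motor's mechanical range. The `scale`, `offset`, and joint limits below are illustrative placeholders, not the published coefficients or Nao's actual specifications.

```python
def posture_to_motor(angle_deg, scale, offset, limits):
    """Map a posture joint angle (degrees) to a Nao motor angle via a
    linear transformation, then clamp to the motor's mechanical range.
    scale/offset stand in for Erden-style transformation coefficients;
    all numbers used with this function here are illustrative only."""
    motor = scale * angle_deg + offset
    lo, hi = limits
    return max(lo, min(hi, motor))

# Hypothetical mapping for one shoulder joint of the happiness posture
shoulder = posture_to_motor(140.0, scale=0.8, offset=-20.0, limits=(-119.5, 119.5))
# A posture angle beyond the motor's reach is clamped rather than passed through
extreme = posture_to_motor(200.0, scale=0.8, offset=-20.0, limits=(-119.5, 119.5))
```

Clamping is what forces compromises like the inward-moved arm for Disgust noted in the Limitations slide: when the target posture exceeds a motor's range, the implemented posture deviates from the Coulson original.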

SLIDE 7

Experimental Design

  • Within subjects & between subjects
    ○ All subjects experienced the Virtual and Physical embodiment conditions
    ○ Interleaved order of presentation of the Physical/Virtual conditions
  • Randomized order of emotion presentation
    ○ For each trial in each Physical/Virtual condition
  • Questionnaire
    ○ Classification of emotion
    ○ Measure of emotion intensity (10-point Likert scale)
    ○ Measure of emotion naturalness (10-point Likert scale)
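The presentation scheme above can be sketched in a few lines. The exact randomization procedure is not detailed on the slide, so this is an illustrative implementation in which each participant gets a fixed condition order and a freshly shuffled emotion order within each condition.

```python
import random

EMOTIONS = ["Happiness", "Sadness", "Fear", "Surprise", "Anger", "Disgust"]

def trial_order(conditions=("Physical", "Virtual"), seed=None):
    """Build one participant's schedule: the embodiment-condition order is
    given, and the six emotions are re-shuffled within each condition."""
    rng = random.Random(seed)
    schedule = []
    for cond in conditions:
        emotions = EMOTIONS[:]      # copy so the module-level list is untouched
        rng.shuffle(emotions)       # fresh random emotion order per condition
        schedule.append((cond, emotions))
    return schedule

# One participant in the Physical-first group
schedule = trial_order(seed=8)
```

A Virtual-first participant would simply be built with `conditions=("Virtual", "Physical")`.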

SLIDE 8

Within and Between Subjects

Mixed Ordering of Condition Presentation

7 Participants: Physical first, Virtual second; 3 Participants: Virtual first, Physical second

SLIDE 9

Questionnaire

Following each emotion expression, participants filled out the following questionnaire:

SLIDE 10

Demonstration (video)

SLIDE 11

Results

Sample:

  • 10 Subjects

○ 7 viewed Physical first, 3 viewed Virtual first

  • Demographics:

○ 3 Females
○ 7 Males
○ All master’s students, Swedish & international

SLIDE 12

Results: Emotion Classification (Between Subjects)

For the first viewing (Between Subjects Analysis)

  • 41/42 (97.6%) correct classifications of emotions in the Physical-first condition (7 subjects)
  • 15/18 (83.3%) correct classifications of emotions in the Virtual-first condition (3 subjects)
  • Chi-square analysis
    ○ Chi² value of 4.132, p = 0.0421
    ○ Participants in the Physical-first condition were significantly better at classifying emotions than those in the Virtual-first condition
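These figures can be reproduced directly from the counts above. A minimal pure-Python sketch of the Pearson chi-square test on the 2×2 table (condition × correct/incorrect); for df = 1 the p-value reduces to a complementary error function, and `scipy.stats.chi2_contingency` with `correction=False` gives the same result.

```python
import math

def chi_square_2x2(table):
    """Pearson chi-square (no continuity correction) for a 2x2 table,
    with the df = 1 p-value computed via the complementary error function."""
    rows = [sum(r) for r in table]
    cols = [sum(c) for c in zip(*table)]
    total = sum(rows)
    chi2 = sum(
        (table[i][j] - rows[i] * cols[j] / total) ** 2 / (rows[i] * cols[j] / total)
        for i in range(2) for j in range(2)
    )
    # for X ~ chi-square(1): P(X > x) = erfc(sqrt(x / 2))
    p = math.erfc(math.sqrt(chi2 / 2))
    return chi2, p

# Correct vs. incorrect first-viewing classifications per condition
observed = [[41, 1],   # Physical-first: 41 correct, 1 incorrect
            [15, 3]]   # Virtual-first:  15 correct, 3 incorrect
chi2, p = chi_square_2x2(observed)
```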

SLIDE 13

Results: Emotion Classification (Within Subjects)

Combining first and second viewings, the McNemar test was used.

  • Note: Chi-square cannot be used for the within-subjects analysis due to violation of the independent-observations assumption
  • 59/60 (98.3%) correct classifications of all Physical emotion expressions
  • 56/60 (93.3%) correct classifications of all Virtual emotion expressions
  • McNemar test
    ○ No significant difference in subjects’ ability to classify emotions between the Physical and Virtual conditions (p = .125) when combining first and second viewings
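The exact (binomial) McNemar test depends only on the discordant pairs, i.e. expressions classified correctly under one embodiment but not the other. The per-item discordant counts are not given on the slide, so the sketch below implements the test generically; the b = 4, c = 0 example is chosen only because it reproduces the reported p = .125, not because those were necessarily the actual counts.

```python
from math import comb

def mcnemar_exact(b, c):
    """Exact two-sided McNemar test. b = correct under Physical only,
    c = correct under Virtual only (concordant pairs are ignored)."""
    n = b + c
    k = min(b, c)
    # two-sided exact binomial p-value under H0: discordance is 50/50
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Illustrative discordant counts (not reported on the slide)
p = mcnemar_exact(4, 0)
```

With so few discordant pairs the exact binomial variant is the appropriate choice; the chi-square approximation to McNemar would not be reliable here.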

SLIDE 14

5/120 Misclassifications

  • Anger misclassified once as Disgust
  • Disgust misclassified 3 times as Fear (to be discussed, no pun intended)
  • Surprise misclassified once as Happiness
SLIDE 15

Results: Paired Sample T-Tests Virtual vs Physical

Intensity and Naturalness across all Emotions: Within Subjects (combined first and second viewing)

SLIDE 16

Physical vs Virtual: Intensity by Emotion (within subjects)

Paired Samples T-Tests

  • No sig. differences
SLIDE 17

Physical vs Virtual: Naturalness by Emotion (within subjects)

Paired Samples T-Tests

  • Sig. difference for Fear only
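The paired-sample t statistic behind these comparisons is straightforward to compute from per-subject rating differences. A self-contained sketch with hypothetical 10-point ratings (the actual rating data are not on the slides):

```python
import math

def paired_t(x, y):
    """Paired-sample t statistic: mean difference over its standard error."""
    d = [a - b for a, b in zip(x, y)]
    n = len(d)
    mean = sum(d) / n
    var = sum((v - mean) ** 2 for v in d) / (n - 1)   # sample variance of differences
    return mean / math.sqrt(var / n)                  # compare against t with df = n - 1

# Hypothetical per-subject intensity ratings (illustrative only)
physical = [8, 7, 9, 6, 8]
virtual  = [6, 7, 8, 5, 7]
t_stat = paired_t(physical, virtual)
```

`scipy.stats.ttest_rel` computes the same statistic together with its p-value.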
SLIDE 18

Conclusions and Discussion

1. Upon first viewing, subjects were better able to classify emotions expressed by the physical robot compared to the virtual robot.
2. The difference in ability to classify emotions between these conditions disappeared when combining first- and second-viewing data (improvement bias).
3. Overall (across all emotions), the physical robot was rated as more intense. However, when looking at individual emotions, no individual emotion displayed any significant difference between the Physical/Virtual conditions.
4. Overall, there was no significant difference in ratings of naturalness between the two conditions; however, Fear was rated as significantly more natural in the physical robot.

SLIDE 19

Limitations

  • Small sample size
  • Unbalanced between-subjects samples at first viewing (7 vs 3 subjects in the Physical-first and Virtual-first conditions respectively)
  • The within-subjects design prohibits use of ANOVA (and the between-subjects sample is too small & unbalanced to conduct a meaningful ANOVA)
  • Transformation functions restricted by the range of Nao robot movements
  • Motor angles for Disgust resulted in an unbalanced centre of weight of the robot; one arm had to be moved inward, possibly causing worse classification of Disgust
  • Questionnaire framing: classification limited to a set of 6 possible answers. It would be better if subjects generated an answer themselves.
SLIDE 20

Ethical Concerns

  • Virtue ethics
    ○ Virtuous emotions
    ○ Moral concerns
    ○ Emotions’ moral variations
  • Consequentialism
    ○ Goodness
    ○ Alternative emotions
  • Deontology
    ○ Emotions on act
    ○ Interactive emotions
  • General concern: emotion expression in robots could be used to manipulate people.

SLIDE 21

References

Costa, S. et al., 2016. Emotional storytelling using virtual and robotic agents. Retrieved May 2017 from: https://arxiv.org/pdf/1607.05327.pdf

Coulson, M., 2004. Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence. Journal of Nonverbal Behavior, 28(2): 1-23.

Erden, M. S., 2013. Emotional postures for the humanoid Nao robot. Int J Soc Robot, 5: 441-446.

Hyde, J. et al., 2014. Assessing naturalness and emotional intensity: a perceptual study of animated facial motion. Proceedings of the ACM Symposium on Applied Perception, 1: 15-22.

Paiva, A., Leite, I., Ribeiro, T., 2014. Emotion modelling for social robots. Retrieved May 2017 from: http://people.ict.usc.edu/~gratch/CSCI534/Readings/ACII-Handbook-Robots.pdf

Parker, R., 2012. The meaning of colours. Retrieved May 2017 from: https://resources.oncourse.iu.edu/access/content/user/rreagan/Filemanager_Public_Files/meaningofcolors.htm