IIS - PROJECT 3 Group 8
Qian Sun Louis Janse van Rensburg Abdulghani Zubeir
Previous Literature

Virtual vs Physical Embodiment
○ Costa et al. (2016) found users are more attentive to physically embodied emotional storytelling (robot with facial expression capability)
○ Nao limitation: no facial motors; dependence on postural expression and use of LEDs and sound
Conveying Emotion without Facial Expression
○ Static body postures (Coulson, 2004) used to express Happiness, Sadness & Fear in a Nao robot (Erden, 2013)
○ Vocalization sounds chosen to have a 'high' auditory emotion level to supplement postural expression
When combining the LED, sound & postural capacities of the Nao robot: 1. Does the rate of correct identification of the expressed emotion (Happiness, Sadness, Fear, Surprise, Anger, Disgust) differ between Virtual and Physical embodiment conditions? 2. Do ratings of naturalness and intensity differ between Virtual and Physical embodiment conditions?
○ Across all emotions ○ For each individual emotion
1. Emotion Postures for the 6 emotions derived from Coulson (2004)
○ For each emotion they determined the most identifiable posture from 32 possible candidate-postures
2. Transformation functions of the joint angles of the happiness posture, used to map postures to the Nao robot in Erden (2013)
○ We extrapolated this transformation function to determine motor-joint angles in Nao postures for all emotions
3. LED colors determined by psychological color-emotion association (Parker, 2012)
4. Emotion vocalization sounds selected from public soundbite sites
○ Two embodiment conditions: Physical / Virtual
Posture joint angles for the 6 basic emotions (Coulson, 2004) → transformation functions (Erden, 2013) → Nao motor angles, implemented using Choregraphe
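The mapping step above can be sketched as a per-joint linear transformation followed by clipping to Nao's motor limits. Everything here except the Nao joint-limit values is a hypothetical placeholder; the real posture angles and transformation coefficients come from Coulson (2004) and Erden (2013):

```python
# Sketch of posture-to-motor mapping: hypothetical per-joint linear
# transforms applied to hypothetical Coulson-style posture angles,
# then clipped to Nao's (real) joint limits.

# Hypothetical posture description for one emotion, in degrees.
coulson_posture = {"HeadPitch": -20.0, "LShoulderPitch": 40.0, "RShoulderPitch": 40.0}

# Hypothetical linear transform per joint: nao_angle = a * coulson_angle + b.
transform = {"HeadPitch": (1.0, 0.0), "LShoulderPitch": (0.9, 5.0), "RShoulderPitch": (0.9, 5.0)}

# Nao joint limits in degrees (these values are from the Nao spec).
nao_limits = {"HeadPitch": (-38.5, 29.5), "LShoulderPitch": (-119.5, 119.5), "RShoulderPitch": (-119.5, 119.5)}

def to_nao_angles(posture):
    """Map posture joint angles to Nao motor angles, clipped to joint limits."""
    angles = {}
    for joint, theta in posture.items():
        a, b = transform[joint]
        lo, hi = nao_limits[joint]
        angles[joint] = max(lo, min(hi, a * theta + b))
    return angles

print(to_nao_angles(coulson_posture))
```

In the actual pipeline the resulting target angles would then be keyframed in Choregraphe rather than computed at runtime.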
○ All subjects experienced Virtual and Physical Embodiment conditions ○ Interleaved Order of Presentation of Physical/Virtual Conditions
○ For each trial in each physical/virtual condition
○ Classification of Emotion ○ Measure of Emotion Intensity (10 Point Likert Scale) ○ Measure of Emotion Naturalness (10 Point Likert Scale)
Mixed Ordering of Condition Presentation
7 Participants: Physical first, Virtual second; 3 Participants: Virtual first, Physical second
Following each emotion expression, participants completed the following:
Sample:
○ 7 viewed Physical first, 3 viewed Virtual first
○ 3 females, 7 males
○ All master's students (Swedish & international)
For the first viewing (between-subjects analysis): Chi-square test comparing emotion classification between the Physical-first condition (7 subjects) and the Virtual-first condition (3 subjects)
○ Chi2 value of 4.132, p = 0.0421 ○ Participants in the Physical-first condition were significantly better at classifying emotions than those in the Virtual-first condition.
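A between-subjects comparison like this reduces to a chi-square test on a 2x2 contingency table of correct/incorrect classifications. The counts below are hypothetical placeholders, not the study's data, so the printed statistics will not match the reported chi2 = 4.132:

```python
from scipy.stats import chi2_contingency

# Hypothetical counts: rows are first-viewing condition, columns are
# (correct, incorrect) classifications. 7 physical-first subjects x 6
# emotions = 42 trials; 3 virtual-first subjects x 6 emotions = 18 trials.
table = [[30, 12],   # physical-first
         [ 8, 10]]   # virtual-first

# correction=False gives the uncorrected Pearson chi-square statistic.
chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.3f}, dof = {dof}, p = {p:.4f}")
```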
Combining first and second viewings, a McNemar test was used
○ No significant difference in subjects' ability to classify emotions between Physical and Virtual conditions (p = .125) when combining first and second viewings
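The McNemar test operates on the discordant trials only: those classified correctly in one embodiment condition but not the other. A minimal exact version can be written directly from the binomial definition; the 6-vs-1 split below is a hypothetical illustration that happens to reproduce p = .125, not the study's actual counts:

```python
from math import comb

def mcnemar_exact(b, c):
    """Two-sided exact McNemar test on the discordant pair counts:
    b = trials correct only in the physical condition,
    c = trials correct only in the virtual condition."""
    n, k = b + c, min(b, c)
    # Under H0 each discordant trial is a fair coin flip: Binomial(n, 0.5).
    p = 2 * sum(comb(n, i) for i in range(k + 1)) * 0.5 ** n
    return min(p, 1.0)

# Hypothetical discordant counts (6 physical-only vs 1 virtual-only).
print(mcnemar_exact(6, 1))  # -> 0.125
```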
Intensity and Naturalness across all Emotions: Within Subjects (combined first and second viewing)
Paired Samples T-Tests
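A paired-samples t-test pairs each subject's rating in one condition with the same subject's rating in the other. A minimal sketch with hypothetical ratings (not the study's data), assuming per-subject mean intensity on the 10-point Likert scale:

```python
from scipy.stats import ttest_rel

# Hypothetical per-subject mean intensity ratings (10-point Likert scale),
# one pair per subject: physical vs. virtual embodiment condition.
physical = [7.2, 6.8, 8.0, 6.5, 7.5, 6.9, 7.8, 6.2, 7.1, 6.6]
virtual  = [6.1, 6.5, 7.2, 6.0, 6.8, 6.4, 7.0, 6.3, 6.5, 6.2]

t_stat, p_value = ttest_rel(physical, virtual)
print(f"t = {t_stat:.3f}, p = {p_value:.4f}")
```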
1. Upon first viewing, subjects were better able to classify emotions expressed by the physical robot than by the virtual robot. 2. The difference in classification ability between conditions disappeared when combining first and second viewing data (improvement bias). 3. Overall (across all emotions) the Physical robot was rated as more intense than the Virtual robot; however, for no individual emotion did the difference between physical/virtual conditions reach significance. 4. Overall, there was no significant difference in naturalness ratings between the two conditions; however, Fear was rated as significantly more natural in the Physical robot.
Limitations:
○ The first-viewing groups were unbalanced (7 and 3 subjects viewed the physical and virtual conditions first, respectively); the sample set is too small & unbalanced to conduct a meaningful ANOVA
○ To map the Disgust posture onto the physical robot, one arm had to be moved inward, possibly causing worse classification of Disgust
Ethical Considerations
○ Moral concerns around giving robots emotional expression
○ Risk that emotional expression could be used to manipulate people
References
Costa, S. et al., 2016. Emotional storytelling using virtual and robotic agents. Retrieved May 2017 from: https://arxiv.org/pdf/1607.05327.pdf
Coulson, M., 2004. Attributing emotion to static body postures: Recognition accuracy, confusions, and viewpoint dependence. Journal of Nonverbal Behavior 28: 117-139.
Erden, M. S., 2014. Emotional postures for humanoid Nao robot. Int J Soc Robot 5: 441-446.
Hyde, J. et al., 2014. Assessing naturalness and emotional intensity: a perceptual study of animated facial motion. Proceedings of the ACM Symposium on Applied Perception. 1: 15-22.
Paiva, A., Leite, I., Ribeiro, T., 2014. Emotion Modelling for Social Robots. Retrieved May 2017 from: http://people.ict.usc.edu/~gratch/CSCI534/Readings/ACII-Handbook-Robots.pdf
Parker, R., 2012. The meaning of colours. Retrieved May 2017 from: https://resources.oncourse.iu.edu/access/content/user/rreagan/Filemanager_Public_Files/meaningofcolors.htm