Virtual Actors
Machine emulation of character gesture behaviour as portrayed by human actors
- By Sri Sri Perangur
Supervised by Dr. Suresh Manandhar
Objective: To produce automated human gesturing behaviour for personification of humanoids.
Example gestures: "OK", "That way"
Psychological study + Social psychology study + Video analysis
Role theory + Film production
Hand & Mind [McNeill, 2009]; Gesture & Thought [McNeill, 2006]; The Face [Ekman, 2003]; Human-Human Nonverbal Communication [Ekman & Friesen, 1991]
Influencing factors:
- Long-term factors: cultural use, regional use, …
- Immediate factors: recent history of events, relationship between speaker and audience, environment, …

Classification difficulties arise due to:
- Multiple classifications of the same gesture
- Overlapping of gestures
- …

Thus gesture interpretation can be very ambiguous without context and a history of context.

[Figure: Overlapping gestures]
Annotation: Dialogue annotation
Per-utterance gesture annotation (two example rows, shown transposed):

| Field | Example 1 | Example 2 |
| --- | --- | --- |
| Dialogue Act | Question | State |
| Emotion | Annoyed | Annoyed |
| F_Gesture type | Emblem | None |
| F_Gesture Binary | Y | N |
| Head Movement | None | Forward |
| Head Binary | N | Y |
| Eyebrow Movement | AU3 | None |
| Eyebrow Binary | Y | N |
| Eye Movement | None | None |
| Eye Binary | N | N |
| Lip Mmt | None | None |
| Lip Binary | N | N |
| B_Gesture type | None | None |
| B_Gesture Binary | N | N |
| Body Gesture | None | None |
| Body Part | None | None |

Context annotation:

| Sentence | Self conscious | S_Positive | S_Negative | Public | Private | Friend | Colleague | Acquaintance | Stranger |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Is it all gone Kit? | N | N | Y | Y | N | N | Y | N | N |
| Hey, let's go. | N | N | N | Y | N | N | Y | N | N |
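A minimal sketch of how one row of the annotation scheme above could be represented in code. Python is used for illustration (the project itself does not specify a data format); field names follow the table, and the class and method names here are assumed.

```python
from dataclasses import dataclass

# Hypothetical container for one annotated utterance; the field names mirror
# a subset of the annotation columns in the table above.
@dataclass
class GestureAnnotation:
    dialogue_act: str      # e.g. "Question", "State"
    emotion: str           # e.g. "Annoyed"
    f_gesture_type: str    # facial gesture class, "None" if absent
    head_movement: str     # e.g. "Forward"
    eyebrow_movement: str  # facial action unit, e.g. "AU3"
    body_gesture: str

    def has_facial_gesture(self) -> bool:
        # The binary columns in the table mirror whether a class is "None"
        return self.f_gesture_type != "None"

row1 = GestureAnnotation("Question", "Annoyed", "Emblem", "None", "AU3", "None")
row2 = GestureAnnotation("State", "Annoyed", "None", "Forward", "None", "None")
print(row1.has_facial_gesture())  # True: an Emblem gesture was annotated
print(row2.has_facial_gesture())  # False
```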
Annotation: Dialogue annotation
Dialogue Act Emotion F_Gesture type F_Gesture Binary Head Movement Head Binary Eyebrow Movement Eye Movement Eye Binary Lip Mmt Lip Binary B_Gesture type B_Gesture Binary Body Gesture Body Part Question Annoyed Emblem Y None N AU3 Y None N None N None N None None State Annoyed None N Forward Y None N None N None N None N None None Sentence Self conscious S_Positive S_Negative Public Private Friend Colleague Acquaintance Stranger Is it all gone Kit? N N Y Y N N Y N N Hey, let's go. N N N Y N N Y N N
Annotation: Gesture semantics model
Annotation: Overview
Basic setup: Environmental influencing factors + Emotion + Dialogue Act
Movements: Facial gesture type + Facial movements + Body gesture type + Body movements
Distribution of annotated emotions: None (21.37%), Happy (21.08%), Interest (11.68%), Sad (9.12%), Annoyed (6.27%), Concern (4.56%), Joke (3.42%), Thinking (3.13%), Angry (2.85%), Arrogant (2.85%), Surprised (2.56%), Confused (2.28%), Vengeful (1.99%), Excitement (1.71%), Please (1.71%), Amused (0.57%), Disappointment (0.57%), Sympathy (0.57%), Embarrassed (0.28%), Impressed (0.28%), Instruct (0.28%), Secretive (0.28%)
Prediction targets:
- Face gesture (Y/N)
- Face gesture type (classes)
- Head movement (Y/N)…
- Head movement (classes)…

Machine learning methods implemented:
- C4.5 decision trees (known as J48 in the Java implementation)
- Naïve Bayes
- SVM

Testing: 10-20% of the data held out
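The project's classifiers (J48 is Weka's Java implementation of C4.5) are off-the-shelf; as one self-contained illustration, here is a minimal categorical Naïve Bayes, one of the three methods compared. Features, labels, and function names are illustrative, and the hold-out split from the slides is omitted for brevity ("Deictic" is one of McNeill's gesture categories):

```python
import math
from collections import Counter, defaultdict

def train_nb(rows, labels):
    """rows: list of categorical feature tuples; labels: gesture classes."""
    class_counts = Counter(labels)
    feature_counts = defaultdict(Counter)  # (feature_idx, class) -> value counts
    for row, label in zip(rows, labels):
        for i, value in enumerate(row):
            feature_counts[(i, label)][value] += 1
    return class_counts, feature_counts

def predict_nb(model, row):
    class_counts, feature_counts = model
    total = sum(class_counts.values())
    best, best_score = None, -math.inf
    for label, count in class_counts.items():
        score = math.log(count / total)  # log prior
        for i, value in enumerate(row):
            counts = feature_counts[(i, label)]
            # Laplace smoothing so unseen feature values don't zero out a class
            score += math.log((counts[value] + 1) / (count + len(counts) + 1))
        if score > best_score:
            best, best_score = label, score
    return best

# (dialogue_act, emotion) -> body gesture class, echoing the annotation scheme
X = [("Question", "Annoyed"), ("State", "Annoyed"),
     ("Question", "Happy"), ("State", "Happy")]
y = ["None", "None", "Deictic", "None"]
model = train_nb(X, y)
print(predict_nb(model, ("Question", "Happy")))  # "Deictic"
```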
Facial gesture prediction accuracy
[Chart: accuracies per classifier (J48 / Naïve Bayes / SVM): 89.24% / 88.67% / 90.08%; 49.01% / 49.01% / 50.99%; 39.38% / 39.38% / 39.66%; 85.55% / 75.92% / 85.55%]
[Chart: Body Gesture accuracy: J48 69.97%, Naïve Bayes 69.69%, SVM 70.54%]
Accuracy of Body movement predictions
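The metric behind these charts is plain classification accuracy: the fraction of held-out utterances whose predicted gesture class matches the human annotation. A sketch with illustrative labels (not the project's actual outputs):

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions matching the gold annotation."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Hypothetical gold vs predicted body-gesture classes for five utterances
y_true = ["None", "None", "Deictic", "Emblem", "None"]
y_pred = ["None", "Deictic", "Deictic", "None", "None"]
print(f"{accuracy(y_true, y_pred):.2%}")  # 60.00%
```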
[Chart: None vs Non-None instance counts: Initial 247 / 106; Body-J48 353; Body-Naïve 315 / 38; Body-SVM 350 / 3]
Distribution of values between None and Non-none classes
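The chart above shows heavy class imbalance: 247 of the 353 instances are "None". A classifier that always predicts "None" would therefore already score about 70%, which is close to the body-gesture accuracies reported earlier, so the majority-class baseline is worth keeping in mind. A quick check:

```python
# Counts taken from the None / Non-None distribution above
labels = ["None"] * 247 + ["Non-None"] * 106

majority = max(set(labels), key=labels.count)
baseline_accuracy = labels.count(majority) / len(labels)
print(f"{majority} baseline: {baseline_accuracy:.2%}")  # None baseline: 69.97%
```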
Machine learning of human gesture behaviour is possible!

Future work:
- Machine mapping of gestures to word semantics within a sentence.
- Prediction of accurate gesture expression duration relative to speech rate.
- Gesture science + Machine learning + Kinect + Stanford Parser

A step closer to humanoid personification.
Contact: Sri Sri Perangur at srisri.perangur@gmail.com or sp574@york.ac.uk or sp574@cs.york.ac.uk
Some of the research material used:
[1] D. McNeill, Hand and Mind: What Gestures Reveal About Thought. The University of Chicago Press, 2009, p. 11.
[2] K. R. Gibson and T. Ingold, Tools, Language and Cognition in Human Evolution. Cambridge University Press, 1993, p. 483.
[6] Dr. S. Manandhar, "AMEDEUS: Slide1." [Online]. Available: http://www.cs.york.ac.uk/amadeus/projects/uda/slide01.html.
[7] K. M. Knutson, E. M. McClellan, and J. Grafman, "Observing social gestures: an fMRI study," Experimental Brain Research, vol. 188, no. 2, pp. 187-98, Jun. 2008.
[8] R. B. Zajonc, "Feeling and thinking: Preferences need no inferences."
[9] R. W. Picard, "Affective computing for HCI," in Proceedings of HCI International, 1999.
[10] Xadhoom, "Facial expressions." [Online]. Available: http://xadhoom.deviantart.com/art/3D-course-facial-expressions-3011857.
[11] M. Montaner, B. López, and J. L. de la Rosa, "Developing trust in recommender agents," in Proceedings of the First International Joint Conference on Autonomous Agents and Multiagent Systems, Part 1 - AAMAS '02, 2002, p. 304.
[13] V. Trehan, "Gesture Mark-up and an Analysis of Spontaneous Gestures," Artificial Intelligence, vol. 802, 2003.
[14] J. M. Pim, "Modelling and prediction of spontaneous gestures," 2006.
[15] V. P. Richmond, J. C. McCroskey, and S. K. Payne, Nonverbal Behaviour in Interpersonal Relations. Englewood Cliffs, NJ, 1987.
[16] M. L. Knapp, Nonverbal Communication in Human Interaction. New York: Holt, Rinehart and Winston, 1972.
[17] L. A. Malandro et al., Nonverbal Communication. Reading, U.K.: Addison-Wesley, 1983.
[18] D. McNeill, Gesture and Thought. Continuum, 2006.