  1. Virtual Actors: Machine emulation of character gesture behaviour as portrayed by human actors. By Sri Sri Perangur, supervised by Dr. Suresh Manandhar.

  2. Project aims. Objective: to produce automated human gesturing behaviour for the personification of humanoids. 1. Develop a structure to identify the semantics expressed in a gesture (examples: "OK", "That way"). 2. Identify the set of stimuli that most influences a person's gesture style: psychology study + social psychology study + video analysis. 3. System development model: role theory + film production. 4. Machine learning of gestures: annotation, then machine learning methods (C4.5, Naïve Bayes & S.V.M.) evaluated with 5- and 10-fold cross-validation.

  3. Why gestures? Definition, use, importance. • Gestures are a person's memories and thoughts rendered visible. Key sources: Hand & Mind [McNeil, 2009], Gesture & Thought [McNeil, 2006], The Face [Ekman, 2003], Human-Human nonverbal communication [Ekman & Frisch, 1991].

  4. Gesture interpretation. Influencing factors overlap. Long-term factors: cultural use, regional use, … Immediate factors: recent history of events, relationship between speaker and audience, environment, … Classification difficulties arise due to multiple possible classifications of a gesture, overlapping gestures, … Thus gesture interpretation can be very ambiguous without context and a history of context.

  5. Why Virtual 'Actors'? • Emulation of human emotion and thought, not replication. • Emulation of natural environmental stimuli = film set. • Expression emulation: no feeling, just acting = film acting. Role theory: humans hold social positions + humans play a 'role'.

  6. Annotation: Dialogue annotation

  7. Annotation: Dialogue annotation. Example annotation of two dialogue lines.
  Context attributes:
  Sentence | Self conscious | S_Positive | S_Negative | Public | Private | Friend | Colleague | Acquaintance | Stranger
  Is it all gone Kit? | N | N | Y | Y | N | N | Y | N | N
  Hey, let's go. | N | N | N | Y | N | N | Y | N | N
  Gesture attributes:
  Dialogue act | Emotion | F_Gesture type | F_Gesture binary | Head movement | Head binary | Eyebrow movement | Eyebrow binary | Eye movement | Eye binary | Lip movement | Lip binary | B_Gesture type | B_Gesture binary | Body gesture | Body part
  Question | Annoyed | Emblem | Y | None | N | AU3 | Y | None | N | None | N | None | N | None | None
  State | Annoyed | None | N | Forward | Y | None | N | None | N | None | N | None | N | None | None

  8. Annotation: Dialogue annotation (continued): the same example annotation table as slide 7.
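To make the annotation scheme concrete, here is one way a single annotated dialogue line could be represented programmatically. This is a minimal sketch only: the field names mirror the table headers above, but the structure itself is hypothetical and not the project's actual annotation format.

```python
# Minimal sketch: one annotated dialogue line as a flat record.
# Field names mirror the annotation table above; the structure itself
# is an assumption, not the project's actual annotation tooling.
annotation_row = {
    "sentence": "Is it all gone Kit?",
    # Environmental / social context attributes
    "self_conscious": "N", "s_positive": "N", "s_negative": "Y",
    "public": "Y", "private": "N",
    "friend": "N", "colleague": "Y", "acquaintance": "N", "stranger": "N",
    # Dialogue act and emotion
    "dialogue_act": "Question", "emotion": "Annoyed",
    # Facial gesture annotations
    "f_gesture_type": "Emblem", "f_gesture": "Y",
    "head_movement": "None", "head": "N",
    "eyebrow_movement": "AU3", "eyebrow": "Y",
    "eye_movement": "None", "eye": "N",
    "lip_movement": "None", "lip": "N",
    # Body gesture annotations
    "b_gesture_type": "None", "b_gesture": "N",
    "body_gesture": "None", "body_part": "None",
}
```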

  9. Annotation: Gesture semantics model

  10. Annotation: Overview. Basic setup: environmental influencing factors + emotion + dialogue act. Movements: facial gesture type + facial movements + body gesture type + body movements. Distribution of emotion labels in the annotated data: None (21.37%), Happy (21.08%), Interest (11.68%), Sad (9.12%), Annoyed (6.27%), Concern (4.56%), Joke (3.42%), Thinking (3.13%), Angry (2.85%), Arrogant (2.85%), Surprised (2.56%), Confused (2.28%), Vengeful (1.99%), Excitement (1.71%), Please (1.71%), Amused (0.57%), Disappointment (0.57%), Sympathy (0.57%), Embarrassed (0.28%), Impressed (0.28%), Instruct (0.28%), Secretive (0.28%).
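A label distribution like the one above can be recomputed directly from the annotated rows. A minimal sketch, reusing the hypothetical annotation_row structure shown after slide 8:

```python
from collections import Counter

def emotion_distribution(rows):
    """Return each emotion label's share of the annotated sentences, in percent."""
    counts = Counter(row["emotion"] for row in rows)
    total = sum(counts.values())
    return {label: 100.0 * n / total for label, n in counts.most_common()}

# With the full annotated corpus this would reproduce the distribution
# reported above (None 21.37%, Happy 21.08%, Interest 11.68%, ...).
# print(emotion_distribution(annotation_rows))
```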

  11. Machine learning stage Face gesture (Y/N)

  12. Machine learning stage Face gesture Type (Classes)

  13. Machine learning stage Head Movement (Y/N)…

  14. Machine learning stage. Machine learning methods implemented: • Naïve Bayes • Support Vector Machine (S.V.M.) • C4.5 algorithm (known as J48 in its Java implementation). Evaluation: 5-fold method, i.e. training on 80% of the data and testing on 20%; 10-fold method, i.e. training on 90% of the data and testing on 10%. Head Movement (classes)…
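A minimal sketch of this evaluation setup, written with scikit-learn rather than the original toolchain (J48 implies a Weka/Java pipeline), and with a CART decision tree standing in for C4.5. The feature encoding and the annotation-row format are assumptions for illustration only.

```python
# Sketch: evaluate Naive Bayes, SVM and a decision tree with 5- and 10-fold
# cross-validation on the annotated attributes, predicting one target attribute.
from sklearn.feature_extraction import DictVectorizer
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import BernoulliNB
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier  # CART, standing in for C4.5/J48

def evaluate(rows, target="f_gesture"):
    # Features: every annotated attribute except the raw sentence and the target.
    X_dicts = [{k: v for k, v in row.items() if k not in (target, "sentence")}
               for row in rows]
    y = [row[target] for row in rows]
    X = DictVectorizer(sparse=False).fit_transform(X_dicts)

    classifiers = {
        "Naive Bayes": BernoulliNB(),
        "SVM": SVC(kernel="linear"),
        "Decision tree (C4.5-like)": DecisionTreeClassifier(),
    }
    for name, clf in classifiers.items():
        for folds in (5, 10):   # 5-fold ~ 80/20 splits, 10-fold ~ 90/10 splits
            scores = cross_val_score(clf, X, y, cv=folds)
            print(f"{name}, {folds}-fold: {scores.mean():.2%} accuracy")
```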

  15. Predicting gesture
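Slides 11-13 suggest one classifier per gesture attribute. The sketch below shows how such trained models might be chained to predict a full gesture for a new dialogue line; the attribute names and the prediction order are assumptions, not the project's confirmed design.

```python
# Hypothetical sketch: predict each gesture attribute of a new dialogue line
# with its own trained classifier (face gesture Y/N, face gesture type,
# head movement, ...), feeding earlier predictions into later stages.
GESTURE_ATTRIBUTES = ["f_gesture", "f_gesture_type", "head_movement",
                      "b_gesture", "b_gesture_type", "body_part"]

def predict_gesture(context_features, models, vectorizers):
    """context_features: dict of dialogue act, emotion and social context."""
    prediction = {}
    features = dict(context_features)
    for attr in GESTURE_ATTRIBUTES:
        X = vectorizers[attr].transform([features])
        prediction[attr] = models[attr].predict(X)[0]
        features[attr] = prediction[attr]  # later stages can reuse earlier predictions
    return prediction
```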

  16. Prediction accuracy: Facial expressions. [Bar chart: facial gesture prediction accuracy across attributes and classifiers; reported accuracies span roughly 39% to 90% (90.08%, 89.24%, 88.67%, 85.55%, 75.92%, 50.99%, 49.01%, 39.66%, 39.38%).]

  17. Body gesture predictions. [Charts: distribution of values between the None and Non-None classes (counts including 353, 350, 315, 247, 106, 38, 3, 0), and accuracy of body movement predictions, where J48, Naïve Bayes and SVM all scored between roughly 69.7% and 70.5%.]

  18. Conclusion & future work. Machine learning of human gesture behaviour is possible! Future work: machine mapping of gestures to word semantics within a sentence; prediction of the gesture expression period relative to speech rate. Gesture science + machine learning + Kinect + Stanford Parser → a step closer to humanoid personification.

  19. Any questions? Contact: Sri Sri Perangur at srisri.perangur@gmail.com or sp574@york.ac.uk or sp574@cs.york.ac.uk. A few of the research materials used:
  [1] D. McNeil, Hand and Mind: What Gestures Reveal about Thought. The University of Chicago Press, 2009, p. 11.
  [2] K. R. Gibson and T. Ingold, Tools, Language and Cognition in Human Evolution. Cambridge University Press, 1993, p. 483.
  [6] Dr. S. Manandhar, "AMEDEUS: Slide1." [Online]. Available: http://www.cs.york.ac.uk/amadeus/projects/uda/slide01.html.
  [7] K. M. Knutson, E. M. McClellan, and J. Grafman, "Observing social gestures: an fMRI study," Experimental Brain Research, vol. 188, no. 2, pp. 187-198, Jun. 2008.
  [8] R. B. Zajonc, "Feeling and thinking: Preferences need no inferences."
  [9] R. W. Picard, "Affective computing for HCI," in Proceedings of HCI International, 1999.
  [10] Xadhoom, "Facial expressions." [Online]. Available: http://xadhoom.deviantart.com/art/3D-course-facial-expressions-3011857.
  [11] M. Montaner, B. López, and J. L. de la Rosa, "Developing trust in recommender agents," in Proceedings of the First International Joint Conference on Autonomous Agents and Multiagent Systems, Part 1 - AAMAS '02, 2002, p. 304.
  [13] V. Trehan and T. Y. Project, "Gesture Mark-up and an Analysis of Spontaneous Gestures," Artificial Intelligence, vol. 802, 2003.
  [14] J. M. Pim, "Modelling and prediction of spontaneous gestures," 2006.
  [15] V. P. Richmond, J. C. McCroskey, and S. K. Payne, Nonverbal Behaviour in Interpersonal Relations. Englewood Cliffs, NJ, 1987.
  [16] M. L. Knapp, Nonverbal Communication in Human Interaction. New York: Holt, Rinehart and Winston, 1972.
  [17] L. A. Malandro et al., Nonverbal Communication. Reading, U.K.: Addison-Wesley, 1983.
  [18] D. McNeil, Gesture and Thought. Continuum, 2006.
