Making Sense of Multimodal Learning Analytics (MMLA)
Marcelo Worsley
Assistant Professor, Learning Sciences & Computer Science
Agenda
- What is MMLA?
- Why use MMLA?
- How to use MMLA?
- Examples of MMLA research
- On-going challenges and …
Learning Analytics: a set of multi-modal sensory inputs that can be used to predict, understand, and quantify student learning. (Worsley and Blikstein, 2011)
Multimodal learning analytics (MMLA) (Blikstein & Worsley, in press; Blikstein, 2013; Worsley, 2012) sits at the intersection of three ideas: multimodal teaching and learning, multimodal data, and computer-supported analysis. At its essence, MMLA utilizes and triangulates among non-traditional as well as traditional forms of data in order to characterize or model student learning in complex learning environments. However, as we describe later, the ways that researchers utilize multimodal data vary widely. (Worsley, Abrahamson, Blikstein, Grover, Schneider and Tissenbaum, 2016)
VIDEO: Pose, Gaze, Gestures, Object Manipulation, Facial Expressions, Proxemics, Scene Understanding, Movement, Heartrate, Object Tracking
AUDIO: Speaker Diarization, Prosody, Spectrum, Emotion/Affect, Intensity, Speaking Rate, Collaboration/Turn-Taking, Voice Quality
TEXT: Emotion/Affect, Complexity, Cohesion, Semantics, Syntactics, Content
EYE TRACKING DATA: Fixations, Fixation Pattern, Pupil Dilation, Eye Location, Attention, Head Pose
DEPTH CAMERA: Gestures, Body Position, Joint Tracking
SMARTPHONES: Sitting/Standing, Location, Noise Level
LEAP MOTION: Gestures, Tool Usage, Hand Movement, Finger Movement
EDA WATCH: Arousal, Cognitive Load, Body Temperature, Gestures, Heartrate, Stress
EVENT LOGS: Activity
OTHER MODALITIES: fMRI, Electrocardiography, Electroencephalography, Electromyography, Electrogastrography
[Slide fragments: environments; traditional modalities; human inference; designs]
DATA CAPTURE SOFTWARE: Open Social Signal Interpretation, Lab Streaming Layer, Open Pipe Kit, Open Broadcaster Software, Multisense (AV Recorder), iMotions Attention Tool
Pre-processing is important for data synchronization, accounting for individual differences between participants, and getting data into the appropriate format for data analysis.
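The two pre-processing steps above can be sketched in a few lines. This is a minimal, hypothetical illustration (toy timestamps and values, pure Python): streams recorded at different rates are aligned to a shared clock by nearest-timestamp matching, and each participant's signal is z-scored to account for individual baselines.

```python
# Minimal pre-processing sketch (hypothetical data, not a specific toolkit):
# (1) align two multimodal streams onto one timebase, (2) normalize per person.

def align_nearest(ref_times, times, values):
    """For each reference timestamp, pick the sample with the closest timestamp."""
    aligned = []
    for t in ref_times:
        i = min(range(len(times)), key=lambda j: abs(times[j] - t))
        aligned.append(values[i])
    return aligned

def zscore(xs):
    """Rescale a signal to zero mean / unit variance (per participant)."""
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    return [(x - mean) / (var ** 0.5 or 1.0) for x in xs]

# Toy example: video frame timestamps vs. slower EDA samples
video_t = [0.0, 0.25, 0.5, 0.75, 1.0]
eda_t = [0.0, 0.3, 0.6, 0.9]
eda_v = [0.41, 0.44, 0.52, 0.47]

aligned = align_nearest(video_t, eda_t, eda_v)  # one EDA value per video frame
print(aligned)
print(zscore(aligned))
```

In practice the same idea is handled by tools such as Lab Streaming Layer at capture time, or by resampling libraries during analysis; the point here is only the shape of the computation.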
DATA EXTRACTION SOFTWARE
VIDEO: OpenFace (gaze), OpenFace (face recognition), Emotient FACET, Noldus FaceReader, OpenCV, Affectiva, Microsoft Face API, Microsoft Emotion API, IntraFace, Multisense, Eulerian Magnification Code, iMotions
AUDIO: HTK, Praat, openSMILE, openEAR, COVAREP, ELAN, ICSI Diarizer, Matlab, LIUM Diarizer, CMU Sphinx, Google ASR, AT&T Watson ASR, Bing Speech API, Audacity, EmoVoice
TEXT: Natural Language Toolkit, LightSide, LIWC, Coh-Metrix, Tone Analyzer, Stanford Parser, WordNet, SentiWordNet, MALLET, Word2Vec
EYE TRACKING DATA: PyGaze, OGAMA, eyetrackingR, Tobii SDK, SMI SDK, iMotions Attention Tool, Pupil Dilation
DEPTH CAMERA: OpenNUI, Kinect for Windows, QSRLib, libfreenect
EDA WATCH: Ledalab
MULTIPLE MODALITIES: Open Social Signal Interpretation; Human Coders
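To make the TEXT row concrete, here is an illustrative sketch (pure Python, no external toolkit) of two simple text features of the kind the listed tools compute at scale: lexical complexity via type-token ratio, and a naive cohesion proxy via word overlap between consecutive utterances. Real analyses would use NLTK, LIWC, or Coh-Metrix; the feature definitions below are simplified stand-ins.

```python
# Two toy text features over a transcript (illustrative, not a real toolkit).

def type_token_ratio(text):
    """Unique words / total words: a rough lexical-diversity (complexity) measure."""
    words = text.lower().split()
    return len(set(words)) / len(words) if words else 0.0

def overlap_cohesion(utterances):
    """Mean Jaccard word overlap between consecutive utterances (cohesion proxy)."""
    scores = []
    for a, b in zip(utterances, utterances[1:]):
        wa, wb = set(a.lower().split()), set(b.lower().split())
        scores.append(len(wa & wb) / len(wa | wb))
    return sum(scores) / len(scores) if scores else 0.0

transcript = ["the gears turn the wheel", "the wheel drives the belt"]
print(type_token_ratio(" ".join(transcript)))
print(overlap_cohesion(transcript))
```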
Let’s say that you want to study engagement and have audio, video, bio-physiological and gesture data available for analysis. How do you use these to compute a measure of engagement?
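One common answer is late fusion: extract one indicator per modality, normalize each, and combine them into a single score. The sketch below is a hypothetical illustration only; the feature names, weights, and the linear fusion rule are assumptions for demonstration, not a validated engagement model.

```python
# Hypothetical late-fusion sketch: per-modality indicators -> one engagement score.

def engagement_score(features, weights):
    """Weighted average of per-modality indicators, each already scaled to [0, 1]."""
    total = sum(weights.values())
    return sum(features[k] * w for k, w in weights.items()) / total

# Toy indicators (all names and values are illustrative assumptions)
features = {
    "speech_rate": 0.7,       # audio: normalized speaking rate
    "gaze_on_task": 0.9,      # video / eye tracking: fraction of time on task
    "gesture_activity": 0.5,  # depth camera: movement energy
    "eda_arousal": 0.6,       # wristband: normalized skin conductance
}
weights = {"speech_rate": 1, "gaze_on_task": 2, "gesture_activity": 1, "eda_arousal": 1}

print(round(engagement_score(features, weights), 3))
```

In real studies the fusion step is usually learned (e.g., a classifier trained against human-coded engagement labels in RapidMiner or similar) rather than hand-weighted, but the data flow is the same.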
[Pipeline diagrams: Xbox Kinect data (gaze, movement) analyzed in RapidMiner and interpreted by the researcher(s); the second variant adds researcher-written custom analysis.]
Multimodal Learning Analytics Special Interest Group - http://sigmla.org
CrossMMLA Workshops: EC-TEL 2017, LAK 2018 (tentative)
MMLA Workshops @ ICMI, LAK, ICLS (2012 – Present)
SoLAR LASI – Multimodal Learning Analytics Tutorial Workshops
Journal of Learning Analytics Special Section on Multimodal Learning Analytics
CIRCL Cyberlearning Report
Berland, M., Baker, R. S., & Blikstein, P. (2014). Educational Data Mining and Learning Analytics: Applications to Constructionist Research. Technology, Knowledge and Learning, 19(1–2), 205–220.
Blikstein, P., & Worsley, M. (2016). Multimodal Learning Analytics and Education Data Mining: Using computational technologies to measure complex learning tasks. Journal of Learning Analytics, 3(2), 220–238. https://doi.org/10.18608/jla.2016.32.11
Fouse, A. S. (2011). ChronoViz: A system for supporting navigation of time-coded data. CHI, 1–6. https://doi.org/10.1145/1979742.1979706
Grafsgaard, J. F. (2014). Multimodal Analysis and Modeling of Nonverbal Behaviors During Tutoring. In Proceedings of the 16th International Conference on Multimodal Interaction (pp. 404–408). New York, NY, USA: ACM. https://doi.org/10.1145/2663204.2667611
Leong, C. W., Chen, L., Feng, G., Lee, C. M., & Mulholland, M. (2015). Utilizing depth sensors for analyzing multimodal presentations: Hardware, software and toolkits. In Proceedings of the 2015 ACM International Conference on Multimodal Interaction (ICMI 2015), 547–556. https://doi.org/10.1145/2818346.2830605
Luz, S. (2013). Automatic identification of experts and performance prediction in the multimodal math data corpus through analysis of speech interaction. In Proceedings of the 15th ACM International Conference on Multimodal Interaction (ICMI '13), 575–582. https://doi.org/10.1145/2522848.2533788
Morency, L.-P., Oviatt, S., Scherer, S., Weibel, N., & Worsley, M. (2013). ICMI 2013 grand challenge workshop on multimodal learning analytics. In Proceedings of the 15th ACM International Conference on Multimodal Interaction.
Oviatt, S., & Cohen, A. (2013). Written and multimodal representations as predictors of expertise and problem-solving success in mathematics. In Proceedings of the 15th ACM International Conference on Multimodal Interaction, 1–8. http://dl.acm.org/citation.cfm?id=2533793
Scherer, S., Worsley, M., & Morency, L.-P. (2012). 1st international workshop on multimodal learning analytics. In ICMI (pp. 609–610).
Schneider, B., & Blikstein, P. (2015). Unraveling Students' Interaction Around a Tangible Interface Using Multimodal Learning Analytics. Journal of Educational Data Mining.
Spikol, D. (2017). Using Multimodal Learning Analytics to Identify Aspects of Collaboration in Project-Based Learning. CSCL 2017, 263–270. https://doi.org/10.22318/cscl2017.37
Thompson, K. (2013). Using micro-patterns of speech to predict the correctness of answers to mathematics problems: An exercise in multimodal learning analytics. In Proceedings of the 15th ACM International Conference on Multimodal Interaction. http://dl.acm.org/citation.cfm?id=2533792
Worsley, M. (2012). Multimodal Learning Analytics: Enabling the future of learning through multimodal data analysis and interfaces. In International Conference on Multimodal Interaction, 12–15. https://doi.org/10.1145/2388676.2388755
Worsley, M., Scherer, S., Morency, L.-P., & Blikstein, P. (2016). Exploring Behavior Representation for Learning Analytics. In Proceedings of the 2015 International Conference on Multimodal Interaction (ICMI), 251–258.
Ochoa, X., Worsley, M., Chiluiza, K., & Luz, S. (2014). MLA'14: Third Multimodal Learning Analytics Workshop and Grand Challenges. In Proceedings of the 16th International Conference on Multimodal Interaction, 531–532. https://doi.org/10.1145/2663204.2668318
Assistant Professor, Electrical Engineering and Computer Science & Learning Sciences
marcelo.worsley@northwestern.edu
marceloworsley.com
tiilt.northwestern.edu