

SLIDE 1

Making Sense of Multimodal Learning Analytics (MMLA)

Marcelo Worsley

Assistant Professor, Learning Sciences & Computer Science

SLIDE 2
  • What is MMLA?
  • Why use MMLA?
  • How to use MMLA?
  • Examples of MMLA research
  • On-going challenges and future directions
  • Resources

Agenda

SLIDE 3

What is MMLA?

Learning Analytics: a set of multi-modal sensory inputs that can be used to predict, understand and quantify student learning. (Worsley and Blikstein, 2011)

SLIDE 4

What is MMLA?

Multimodal learning analytics (MMLA) (Blikstein & Worsley, in press; Blikstein, 2013; Worsley, 2012) sits at the intersection of three ideas: multimodal teaching and learning, multimodal data, and computer-supported analysis. At its essence, MMLA utilizes and triangulates among non-traditional as well as traditional forms of data in order to characterize or model student learning in complex learning environments. However, as we describe later, the ways that researchers utilize multimodal data vary widely. (Worsley, Abrahamson, Blikstein, Grover, Schneider and Tissenbaum, 2016)

SLIDE 5

What is MMLA?

SLIDE 6

What is MMLA?

SLIDE 7

What is MMLA?

SLIDE 8

VIDEO: Pose, Gaze, Gestures, Object Manipulation, Facial Expressions, Proxemics, Scene Understanding, Movement, Heartrate, Object Tracking

AUDIO: Speaker Diarization, Prosody, Spectrum, Emotion/Affect, Intensity, Speaking Rate, Collaboration/Turn-Taking, Voice Quality

TEXT: Emotion/Affect, Complexity, Cohesion, Semantics, Syntactics, Content

EYE TRACKING DATA: Fixations, Fixation Pattern, Pupil Dilation, Eye Location, Attention, Head Pose

DEPTH CAMERA: Gestures, Body Position, Joint Tracking

SMARTPHONES: Sitting/Standing, Location, Noise Level

LEAP MOTION: Gestures, Tool Usage, Hand Movement, Finger Movement

EDA WATCH: Arousal, Cognitive Load, Body Temperature, Gestures, Heartrate, Stress

EVENT LOGS: Activity

OTHER MODALITIES: fMRI, Electrocardiography, Electroencephalography, Electromyography, Electrogastrography

SLIDE 9

Why use MMLA?

  • Teaching and learning are multimodal
  • Study and support complex learning environments
  • See the hard to see
  • Inform design of multimodal technologies
  • Expand notions of learning to non-traditional modalities
  • Improve accessibility and inclusivity
  • Triangulate across modalities
SLIDE 10

Why use MMLA?

  • Visualizing/Representing information for human inference
  • Prediction of indicators
  • Data-driven interventions
  • Evaluating conjecture-based learning designs

SLIDE 11

How to use MMLA?

SLIDE 12

How to use MMLA?

SLIDE 13

How to use MMLA?

DATA CAPTURE SOFTWARE
  • Open Social Signal Interpretation
  • Lab Streaming Layer
  • Open Pipe Kit
  • Open Broadcaster Software
  • Multisense (AV Recorder)
  • iMotions Attention Tool

SLIDE 14

How to use MMLA?


SLIDE 15

How to use MMLA?


SLIDE 16

How to use MMLA?


SLIDE 17

How to use MMLA?


SLIDE 18

How to use MMLA?

SLIDE 19

How to use MMLA?

Pre-processing is important for data synchronization, accounting for individual differences between participants, and getting data in the appropriate format for data extraction. It tends to vary by data type.
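As a concrete illustration of the synchronization step, here is a minimal Python sketch (not from the talk) that resamples two hypothetical streams onto a shared clock and z-scores each feature. The signal names, sampling rates, and waveforms are all made up for the example.

```python
import numpy as np

# Hypothetical example: two sensor streams recorded at different rates.
# Video features arrive at 30 Hz, EDA (skin conductance) at 4 Hz; both
# carry timestamps in seconds. Resample both onto a shared 10 Hz clock
# by linear interpolation so rows line up for later feature extraction.

def resample_stream(timestamps, values, common_clock):
    """Linearly interpolate a 1-D signal onto a shared timeline."""
    return np.interp(common_clock, timestamps, values)

# Fake recordings (10 seconds of data).
video_t = np.arange(0.0, 10.0, 1 / 30)          # 30 Hz
video_motion = np.abs(np.sin(video_t))          # stand-in motion-energy feature
eda_t = np.arange(0.0, 10.0, 1 / 4)             # 4 Hz
eda_level = 0.5 + 0.1 * np.cos(eda_t)           # stand-in arousal signal

common_clock = np.arange(0.0, 10.0, 1 / 10)     # shared 10 Hz timeline

aligned = np.column_stack([
    common_clock,
    resample_stream(video_t, video_motion, common_clock),
    resample_stream(eda_t, eda_level, common_clock),
])

# Per-participant z-scoring: one simple way to account for individual
# differences before modalities are compared or fused.
features = aligned[:, 1:]
features = (features - features.mean(axis=0)) / features.std(axis=0)
print(aligned.shape, features.shape)  # (100, 3) (100, 2)
```

In practice, tools like Lab Streaming Layer handle timestamping at capture time; this sketch only shows the alignment idea for streams that were recorded independently.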
SLIDE 20

How to use MMLA?

SLIDE 21

DATA EXTRACTION SOFTWARE: Open Social Signal Interpretation

VIDEO: OpenFace (gaze), OpenFace (face recognition), Emotient FACET, Noldus Face Reader, OpenCV, Affectiva, Microsoft Face API, Microsoft Emotion API, IntraFace, Multisense, Eulerian Magnification Code, iMotions

AUDIO: HTK, Praat, OpenSmile, OpenEar, Covarep, ELAN, ICSI Diarizer, Matlab, LIUM Diarizer, CMU Sphinx, Google ASR, AT&T Watson ASR, Bing Speech API, Audacity, Emovoice

TEXT: Natural Language Toolkit, LightSide, LIWC, Coh-Metrix, Tone Analyzer, Stanford Parser, WordNet, SentiWordNet, Mallet, Word2Vec

EYE TRACKING DATA: PyGaze, Ogama, EyeTrackingR, Tobii SDK, SMI SDK, iMotions Attention Tool, Pupil Dilation

DEPTH CAMERA: OpenNUI, Kinect for Windows, QSRLib, Libfreenect

EDA WATCH: Ledalab
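To make the extraction step concrete, the short sketch below (not from the talk) computes the kind of low-level audio descriptor that tools like Praat or OpenSmile produce: frame-level RMS intensity plus a crude speech/pause segmentation. The waveform, frame size, and threshold are all illustrative stand-ins.

```python
import numpy as np

# Synthetic two-second "recording": a 220 Hz tone with a half-second
# silence inserted, standing in for speech with a pause.
SR = 16_000                       # sample rate (Hz)
FRAME = 400                       # 25 ms frames
t = np.arange(SR * 2) / SR
wave = np.sin(2 * np.pi * 220 * t)
wave[SR // 2 : SR] = 0.0          # half-second pause

n_frames = len(wave) // FRAME
frames = wave[: n_frames * FRAME].reshape(n_frames, FRAME)
rms = np.sqrt((frames ** 2).mean(axis=1))      # intensity per frame

speaking = rms > 0.1 * rms.max()               # naive voice-activity flag
speaking_rate_proxy = speaking.mean()          # fraction of frames with speech
print(n_frames, round(speaking_rate_proxy, 2))
```

Real extraction pipelines layer far richer descriptors (prosody, spectral features, diarization) on top of frames like these, but the frame-then-aggregate pattern is the same.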

SLIDE 22

How to use MMLA?

SLIDE 23

How to use MMLA?

Human Coders

SLIDE 24

Fusion

Let’s say that you want to study engagement and have audio, video, bio-physiological and gesture data available for analysis. How do you use these to compute a measure of engagement?
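Two common answers to this question in the multimodal literature are feature-level ("early") fusion, where per-modality features are concatenated into one vector before modeling, and decision-level ("late") fusion, where each modality is scored separately and the scores are combined. The sketch below contrasts the two; the feature matrices, scoring function, and weights are entirely synthetic stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic per-modality feature matrices for 50 time windows.
# Real features would come from the extraction tools discussed earlier.
audio = rng.normal(size=(50, 4))      # e.g. prosody, intensity
video = rng.normal(size=(50, 6))      # e.g. pose, facial action units
physio = rng.normal(size=(50, 2))     # e.g. EDA arousal, heart rate

# Early (feature-level) fusion: concatenate, then fit ONE model
# on the joint feature vector.
early = np.hstack([audio, video, physio])           # shape (50, 12)

# Late (decision-level) fusion: score each modality separately,
# then combine the per-modality scores, here by a weighted average.
def engagement_score(features):
    """Stand-in per-modality scorer; a real pipeline would use a
    trained classifier or regressor here."""
    return features.mean(axis=1)

weights = np.array([0.4, 0.4, 0.2])   # hypothetical modality weights
late = (weights[0] * engagement_score(audio)
        + weights[1] * engagement_score(video)
        + weights[2] * engagement_score(physio))

print(early.shape, late.shape)        # (50, 12) (50,)
```

Early fusion lets a model exploit cross-modal interactions but requires synchronized, comparable features; late fusion tolerates missing or differently-sampled modalities at the cost of losing those interactions.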

SLIDE 25

How to use MMLA?

SLIDE 26
SLIDE 27

How to use MMLA?

SLIDE 28

Examples

SLIDE 29
SLIDE 30

Xbox Kinect
  • Audio (Talk)
  • Video (Pose/Gaze)
  • Gestures (Hand Movement)

RapidMiner
  • XMeans

Researcher(s)

SLIDE 31

Xbox Kinect
  • Audio (Talk)
  • Video (Pose/Gaze)
  • Gestures (Hand Movement)

RapidMiner
  • XMeans

Researcher(s)
Custom
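The pipeline on these slides feeds Kinect-derived features (talk, pose/gaze, hand movement) into X-Means clustering in RapidMiner. scikit-learn has no X-Means implementation, so the sketch below approximates its "choose k automatically" idea by scanning k and keeping the best silhouette score; the three-cluster feature matrix is synthetic.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

# Synthetic stand-in for Kinect-derived features: 90 time windows,
# 3 features, drawn from three well-separated behavioral "profiles".
rng = np.random.default_rng(1)
features = np.vstack([
    rng.normal(loc=0.0, scale=0.3, size=(30, 3)),
    rng.normal(loc=2.0, scale=0.3, size=(30, 3)),
    rng.normal(loc=4.0, scale=0.3, size=(30, 3)),
])

# X-Means picks k via a BIC criterion; here we emulate the automatic
# model selection with a silhouette scan over candidate k values.
best_k, best_score, best_labels = None, -1.0, None
for k in range(2, 7):
    km = KMeans(n_clusters=k, n_init=10, random_state=0).fit(features)
    score = silhouette_score(features, km.labels_)
    if score > best_score:
        best_k, best_score, best_labels = k, score, km.labels_

print(best_k)   # with well-separated synthetic clusters, typically 3
```

The resulting cluster labels are what the researchers on the slide would then interpret by hand, mapping clusters back to qualitatively meaningful behaviors.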

SLIDE 32

Challenges & Future Directions

SLIDE 33

Simplifying Data Capture & Analysis

SLIDE 34
SLIDE 35

Better Visualization/Inference Tools

SLIDE 36

Best Practices Around Data Fusion and Data Analysis Pipelines

SLIDE 37

Applications

(especially as it relates to inclusive technology and providing feedback to learners and teachers)

SLIDE 38

Resources

  • Multimodal Learning Analytics Special Interest Group - http://sigmla.org
  • CrossMMLA Workshops: EC-TEL 2017, LAK 2018 (tentative)
  • MMLA Workshops @ ICMI, LAK, ICLS (2012 - Present)
  • SoLAR LASI - Multimodal Learning Analytics Tutorial Workshops
  • Journal of Learning Analytics Special Section on Multimodal Learning Analytics
  • CIRCL Cyberlearning Report

SLIDE 39

Berland, M., Baker, R. S., & Blikstein, P. (2014). Educational Data Mining and Learning Analytics: Applications to Constructionist Research. Technology, Knowledge and Learning, 19(1–2), 205–220.

Blikstein, P., & Worsley, M. (2016). Multimodal Learning Analytics and Education Data Mining: using computational technologies to measure complex learning tasks. Journal of Learning Analytics, 3(2), 220–238. https://doi.org/10.18608/jla.2016.32.11

Fouse, A. S. (2011). ChronoViz: A system for supporting navigation of time-coded data. CHI, 1–6. https://doi.org/10.1145/1979742.1979706

Grafsgaard, J. F. (2014). Multimodal Analysis and Modeling of Nonverbal Behaviors During Tutoring. In Proceedings of the 16th International Conference on Multimodal Interaction (pp. 404–408). New York, NY, USA: ACM. https://doi.org/10.1145/2663204.2667611

Leong, C. W., Chen, L., Feng, G., Lee, C. M., & Mulholland, M. (2015). Utilizing depth sensors for analyzing multimodal presentations: Hardware, software and toolkits. Proceedings of the 2015 ACM International Conference on Multimodal Interaction (ICMI 2015), 547–556. https://doi.org/10.1145/2818346.2830605

Luz, S. (2013). Automatic identification of experts and performance prediction in the multimodal math data corpus through analysis of speech interaction. Proceedings of the 15th ACM International Conference on Multimodal Interaction (ICMI '13), 575–582. https://doi.org/10.1145/2522848.2533788

Morency, L.-P., Oviatt, S., Scherer, S., Weibel, N., & Worsley, M. (2013). ICMI 2013 grand challenge workshop on multimodal learning analytics. Proceedings of the 15th ACM International Conference on Multimodal Interaction (ICMI '13), 373–378. https://doi.org/10.1145/2522848.2534669

Oviatt, S., & Cohen, A. (2013). Written and multimodal representations as predictors of expertise and problem-solving success in mathematics. Proceedings of the 15th ACM International Conference on Multimodal Interaction, 1–8. Retrieved from http://dl.acm.org/citation.cfm?id=2533793

Scherer, S., Worsley, M., & Morency, L.-P. (2012). 1st international workshop on multimodal learning analytics. In ICMI (pp. 609–610).

Schneider, B., & Blikstein, P. (2015). Unraveling Students' Interaction Around a Tangible Interface using Multimodal Learning Analytics. Journal of Educational Data Mining.

Spikol, D. (2017). Using Multimodal Learning Analytics to Identify Aspects of Collaboration in Project-Based Learning. CSCL 2017, 263–270. https://doi.org/10.22318/cscl2017.37

Thompson, K. (2013). Using micro-patterns of speech to predict the correctness of answers to mathematics problems: an exercise in multimodal learning analytics. Proceedings of the 15th ACM International Conference on Multimodal Interaction. Retrieved from http://dl.acm.org/citation.cfm?id=2533792

Worsley, M. (2012). Multimodal Learning Analytics: Enabling the future of learning through multimodal data analysis and interfaces. International Conference on Multimodal Interaction, 12–15. https://doi.org/10.1145/2388676.2388755

Worsley, M., Scherer, S., Morency, L.-P., & Blikstein, P. (2016). Exploring Behavior Representation for Learning Analytics. Proceedings of the 2015 International Conference on Multimodal Interaction (ICMI), 251–258.

Ochoa, X., Worsley, M., Chiluiza, K., & Luz, S. (2014). MLA'14: Third Multimodal Learning Analytics Workshop and Grand Challenges. Proceedings of the 16th International Conference on Multimodal Interaction, 531–532. https://doi.org/10.1145/2663204.2668318

SLIDE 40


Marcelo Worsley

Assistant Professor, Electrical Engineering and Computer Science & Learning Sciences
marcelo.worsley@northwestern.edu
marceloworsley.com
tiilt.northwestern.edu