Personal Analytics: Getting off the Deficit Path
Professor Gregor Kennedy, The University of Melbourne (PowerPoint presentation)


  1. Personal Analytics: Getting off the deficit path Professor Gregor Kennedy The University of Melbourne

  2. What are Personal Analytics? http://birdsontheblog.co.uk/getting-fitter-and-healthier-with-the-fitbit/

  3. What are Personal Analytics? https://gigaom.com/2011/11/07/is-klout-crossing-the-line-when-it-comes-to-privacy/

  4. What are Personal Analytics?

  5. Big Data = Analytics https://www.linkedin.com/today/post/article/20140312180810-246665791-the-future-of-big-data-and-analytics

  6. Learning Analytics is all the Rage

  7. With Great Promise • Detect potential “at risk” students • Formative and summative feedback to students on their learning processes and outcomes • Assist with evidence-based resource allocation • Improve institutional decision-making and responsiveness to known challenges • Promote a shared understanding of institutional successes and challenges • Academic research and development (Long & Siemens, 2011)

  8. Defining Analytics. Learning Analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for the purposes of understanding and optimizing learning and the environments in which it occurs. [Audience: Learners, Educators, Teachers.] Academic Analytics is the improvement of organizational processes, workflows, resource allocation, and institutional measurement through the use of learner, academic, and institutional data. [Audience: Managers, Administrators, Funders. Slide annotation: "THIS BIT IS NOT NEW".] Society for Learning Analytics Research (2011)

  9. Two Traditions: Interactivity Research; Intelligent Tutoring Systems

  10. Two Traditions: Interactivity Research

  11. Taxonomies of Interaction (Interactivity Research). Taxonomies and classifications, e.g. Thompson & Jorgenson (1989): Reactive, Interactive, Proactive

  12. Taxonomies of Interaction (Interactivity Research). Taxonomies and classifications, e.g. Schwier & Misanchuk (1993): Reactive, Proactive, Mutual

  13. Concerns about the past (Interactivity Research) • Often used fairly raw metrics and simple student measures and inputs (e.g. MCQs, simple access counts). • Largely descriptive (useful), but often failed to complete the feedback loop to students and/or teachers.

  14. Two Traditions: Interactivity Research; Intelligent Tutoring Systems

  15. Two Traditions: Intelligent Tutoring Systems. [Diagram: Student Model, Pedagogical Model and Domain Knowledge combine to drive Feedback.]

  16. Two Traditions: Intelligent Tutoring Systems. Escalating feedback: Give a hint; Flag the error; Explain the error; Show a worked example. (Mike Timms)

  17. Concerns about the past Intelligent Tutoring Systems • “ITS were recognised as narrow and brittle” (Cumming & McDougall, 2000) • … heavily reliant on educational programs and applications that had defined or discrete stages and steps. • They were often tied to a program and were not generalisable.

  18. Two Traditions Combined: Interactivity Research + Intelligent Tutoring Systems = a Smart Student System. Assess, Diagnose, Recognise; Personal, Adaptive.

  19. Two Traditions: Interactivity Research; Intelligent Tutoring Systems. Example environments: Drill and Practice, Procedural Simulation, Conceptual Simulation

  20. Drill and Practice. [Diagram: a student path through content items A, B and C, with feedback triggered at a deviation point X.]

  21. Procedural Simulation. [Diagram: a student path with a deviation point X; implicit and explicit feedback.]

  22. Conceptual Simulation. [Diagram: a student path with a deviation point X; implicit and explicit feedback.]

  23. Back to (Today’s) Analytics

  24. How Today’s Analytics are Used: Detect “At Risk” Students for Retention ✓; Teaching & Learning Research, Evaluation & QA ✓; Personalised or Adaptive Feedback for Learning ✓

  25. #1 … “At Risk” Analytics • Purdue University’s “Signals” • Used to predict students who are “at risk” • Individual student risk is predicted using an algorithm based on data from four sources: − Performance … “points earned in a course to date” − Effort … interaction with the learning management system as compared to peers − Academic history … e.g. GPA, prior academic history − Student characteristics … e.g. residency, age (Arnold & Pistilli, 2012)
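The four data sources above can be sketched as a simple weighted risk score feeding a red/amber/green signal. Everything in this sketch is illustrative: the weights, thresholds and traffic-light cut-offs are hypothetical stand-ins, not the actual Signals algorithm described by Arnold & Pistilli.

```python
# Illustrative Signals-style risk score; weights and thresholds are
# hypothetical, not Purdue's actual algorithm.
def risk_score(points_pct, lms_activity_vs_peers, gpa, risk_flag,
               w=(0.4, 0.3, 0.2, 0.1)):
    """Combine the four data sources into a 0-1 risk score.

    points_pct            -- performance: fraction of course points earned to date
    lms_activity_vs_peers -- effort: student's LMS interaction / peer median
    gpa                   -- academic history, on a 0-4 scale
    risk_flag             -- student characteristics: 1 if a known risk factor
    """
    deficits = (
        1.0 - points_pct,                        # low performance raises risk
        1.0 - min(lms_activity_vs_peers, 1.0),   # below-peer effort raises risk
        1.0 - gpa / 4.0,                         # weak academic history
        float(risk_flag),                        # demographic risk factor
    )
    return sum(w_i * d for w_i, d in zip(w, deficits))

def traffic_light(score, amber=0.35, red=0.6):
    """Map a risk score to the signal shown to the student (cut-offs invented)."""
    return "red" if score >= red else "amber" if score >= amber else "green"
```

A student earning 90% of points with above-peer effort and a 3.5 GPA lands on green; one at 30% of points with low effort, a 2.0 GPA and a flagged characteristic lands on red.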

  26. “At Risk” Analytics • Post to students’ LMS • Email or text students • Refer them to an advisor • Call for a chat (Arnold & Pistilli, 2012)

  27. How Today’s Analytics are Used

  28. Analytics = Diagnosing the Deficit Path. The field of educational technology has always been interested in using students’ digital traces to assess and diagnose when they move away from preferred learning pathways. Assess, diagnose and recognise “deficit”. [Diagram: a student path deviating at X from the preferred parameters or pathway, triggering feedback.]

  29. Deficit Pathways This approach has useful pedagogical applications … but Macro: Attrition Micro: Drill & Practice

  30. So what? Is this a problem?

  31. The Promise of Learning Analytics. A core promise of learning analytics is to improve students’ micro learning processes in order to enhance their learning outcomes. How can we … • harness different data analysis techniques • for the provision of more meaningful feedback • to students on their learning processes • in real time • for genuinely personalised learning environments?

  32. Getting off the Deficit Path From Personal Deficit Analytics … … to Personal Learning Analytics

  33. Example 1: Surgical Skills Simulation. James Bailey (Professor, Computing & Information Systems); Ioanna Ioannou (Research Fellow, Otolaryngology); Stephen O'Leary (Professor, Otolaryngology); Patorn Piromchai (PhD Student, Otolaryngology); Sudathi Wijewickrema (Research Fellow, Otolaryngology); Yun Zhou (PhD Student, Computing & Information Systems)

  34. Example 1: Surgical Skills Simulation. O'Leary, S., et al. (2008). Validation of a networked virtual reality simulation of temporal bone surgery. The Laryngoscope, 118(6), 1040-1046.

  35. Metrics from the Simulator • Tool position, orientation and force metrics - e.g. current force applied by the drill • Burr metrics - e.g. radius of the current burr • Anatomical structure metrics - e.g. distance of the drill tip to the closest point of one of three key anatomical structures • Bone specimen metrics - e.g. rotation of the bone ----- 15 records of 48 metrics generated per second -----

  36. A Key Metric: Stroke • A sequence of points containing a continuous drilling motion • The end of a stroke is reached - when drilling ceases; or - when there is an abrupt change in the direction of drilling • Once a way of identifying strokes has been determined, a range of “stroke metrics” can be calculated from the data stream output by the simulator (e.g. stroke duration, stroke length, average stroke speed, minimum distance of stroke to structures, etc.) (Hall, Rathod, et al., 2008)
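The two end-of-stroke conditions above can be sketched as a segmentation pass over the simulator's point stream. The sample format, the direction-change threshold and the helper names are assumptions for illustration; the slides do not specify them.

```python
import math

def _angle(u, v):
    """Angle in degrees between two 3-D vectors (0 for zero-length input)."""
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(a * a for a in v))
    if nu == 0 or nv == 0:
        return 0.0
    cos = sum(a * b for a, b in zip(u, v)) / (nu * nv)
    return math.degrees(math.acos(max(-1.0, min(1.0, cos))))

def segment_strokes(points, angle_thresh_deg=60.0):
    """Split a stream of (x, y, z, drilling) samples into strokes.

    A stroke ends when drilling ceases, or when the direction of motion
    changes abruptly (angle between successive motion vectors exceeds
    angle_thresh_deg). The threshold value is illustrative.
    """
    strokes, current, prev_vec = [], [], None
    for x, y, z, drilling in points:
        if not drilling:                # end condition 1: drilling ceases
            if current:
                strokes.append(current)
                current, prev_vec = [], None
            continue
        if current:
            px, py, pz, _ = current[-1]
            vec = (x - px, y - py, z - pz)
            if prev_vec is not None and _angle(prev_vec, vec) > angle_thresh_deg:
                strokes.append(current)     # end condition 2: abrupt turn
                current = [current[-1]]     # new stroke starts at the turn
            prev_vec = vec
        current.append((x, y, z, drilling))
    if current:
        strokes.append(current)
    return strokes
```

Per-stroke metrics such as stroke length or average speed can then be computed over each returned segment.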

  37. Data Mining for Personal Feedback • We needed to provide personalised feedback to trainees across multiple dimensions or features in an open, complex, procedural simulation. • Not just deficit feedback about manifest error or procedural stage - “You hit the facial nerve” - “You should have completed X before Y” • For example: - force used - stroke length - stroke smoothness - distance to critical structures, etc.

  38. Data Mining for Personal Feedback • Prototype 1: Hidden Markov Models built to discriminate patterns of novice and expert behaviour on a single association rule. • Prototype 2: A range of analysis techniques used to develop models to provide feedback on multiple features: - A random forest model to determine expert/novice behaviour - Nearest neighbour techniques along with a random forest model to generate feedback in the case of novice behaviour - An independent feedback system (application) was built
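The Prototype-2 feedback step can be sketched as: classify a stroke as expert-like or novice-like, and for novice strokes find the nearest expert stroke in feature space and comment on the feature that deviates most. The feature names, the expert reference data, and the threshold-based classifier standing in for the actual random forest are all assumptions for illustration.

```python
# Sketch of nearest-neighbour feedback on multiple stroke features.
# A simple threshold rule stands in for the random forest classifier
# used in the actual system; all numbers here are invented.

FEATURES = ("force", "stroke_length", "smoothness", "distance_to_structure")

# Hypothetical reference strokes recorded from expert surgeons.
EXPERT_STROKES = [
    {"force": 0.3, "stroke_length": 4.0, "smoothness": 0.90, "distance_to_structure": 2.5},
    {"force": 0.4, "stroke_length": 5.0, "smoothness": 0.85, "distance_to_structure": 3.0},
]

def is_novice(stroke):
    """Stand-in for the random forest: flag obviously non-expert strokes."""
    return stroke["force"] > 0.6 or stroke["smoothness"] < 0.5

def nearest_expert(stroke):
    """Nearest-neighbour lookup (Euclidean distance over the feature tuple)."""
    return min(EXPERT_STROKES,
               key=lambda e: sum((stroke[f] - e[f]) ** 2 for f in FEATURES))

def feedback(stroke):
    """Return a feedback message for novice strokes, None for expert-like ones."""
    if not is_novice(stroke):
        return None
    ref = nearest_expert(stroke)
    worst = max(FEATURES, key=lambda f: abs(stroke[f] - ref[f]))
    direction = "reduce" if stroke[worst] > ref[worst] else "increase"
    return f"{direction} {worst} (yours: {stroke[worst]}, expert-like: {ref[worst]})"
```

The point of the design is that feedback names a specific correctable feature (force, length, smoothness, proximity) rather than only reporting a manifest error.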

  39. A Personal Feedback System. [System diagram with components: Simulator, Simulator Metrics, Stroke Detector, Stroke Metrics, Feedback Generator, Proximity Triggers, Technique Feedback, Feedback Parser, Feedback.]

  40. A Personal Feedback System

  41. Feedback System Test • 24 medical students - 12 were provided with automated feedback - 12 were not • Knowledge of anatomy but not surgery; video tutorial of surgery and simulator familiarisation. • Two group comparison of students’ performance on a cortical mastoidectomy - Effectiveness of technique feedback - Accuracy of feedback - Usability of system

  42. Effectiveness of Technique. % of Expert Strokes: With Feedback M (SD) = 61.59 (16.19); Without Feedback M (SD) = 38.86 (13.11); F = 14.29, p < .001
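The reported F statistic can be reproduced from the summary statistics alone: two groups of n = 12 (per slide 41), using the one-way ANOVA decomposition for equal group sizes. A minimal check:

```python
def f_from_summary(means, sds, n):
    """One-way ANOVA F statistic from per-group means and SDs (equal n per group)."""
    k = len(means)
    grand = sum(means) / k
    # Between-groups mean square: n * sum of squared deviations of group
    # means from the grand mean, over k - 1 degrees of freedom.
    ms_between = n * sum((m - grand) ** 2 for m in means) / (k - 1)
    # Within-groups mean square: pooled variance (equal n, so a plain average).
    ms_within = sum(sd ** 2 for sd in sds) / k
    return ms_between / ms_within

f = f_from_summary(means=[61.59, 38.86], sds=[16.19, 13.11], n=12)
# f comes out at about 14.29, matching the value reported on the slide.
```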

  43. Effectiveness of Technique

  44. Accuracy of Feedback • A surgeon undertook a post hoc analysis of the feedback provided by the system - False Positives: feedback was provided when stroke technique was acceptable - False Negatives: feedback was not provided when technique was unacceptable. - Wrong Feedback: participants’ technique was accurately classified as “trainee” but the content of the feedback was inaccurate.

  45. Accuracy of Feedback. Number of feedback messages (percentage of total): False Positives 39 (6.8%); False Negatives 69 (11.4%); Wrong Feedback 52 (9.0%); Total Feedback 576
