
Connecting Instructional Assessment, IR Data, and Student Success



  1. Connecting Instructional Assessment, IR Data, and Student Success Hannah Whang Sayson, Casey Shapiro, Brit Toven-Lindsey CAIR Annual Conference, November 16, 2016

  2. Presentation Overview • Introduction to: – Classroom Observation Protocol for Undergraduate STEM (COPUS) – Generalized Observation Reporting Protocol (GORP) • Case study: UCLA bioinformatics course • Activity and discussion

  3. Classroom Observation Protocol for Undergraduate STEM (COPUS) • Protocol developed by researchers at UMaine and UBC to investigate the range and frequency of teaching practices in STEM classes • Snapshot of all classroom activities at 2-min intervals – Instructor and student activities – Pre-defined observation codes • Smith, M.K., Jones, F.H.M., Gilbert, S.L., & Wieman, C.E. (2013)
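To make the 2-minute snapshot structure concrete, here is a minimal Python sketch of how COPUS-style interval data might be represented and summarized. The class and function names are invented for illustration, only a handful of COPUS code abbreviations appear, and this is not COPUS's or GORP's actual implementation.

```python
# Illustrative sketch of COPUS-style interval coding (structure and names are
# hypothetical; the full code lists are defined in Smith et al., 2013).

from dataclasses import dataclass, field

# A small subset of COPUS code abbreviations: instructor lecturing (Lec),
# posing a question (PQ), moving through the group (MG); students listening
# (L), answering a question (AnQ), working in groups (WG).

@dataclass
class Interval:
    """One 2-minute snapshot; every code observed in the interval is checked."""
    start_min: int
    instructor: set = field(default_factory=set)
    students: set = field(default_factory=set)

def code_frequencies(intervals, who="instructor"):
    """Fraction of intervals in which each code was observed at least once."""
    counts = {}
    for interval in intervals:
        for code in getattr(interval, who):
            counts[code] = counts.get(code, 0) + 1
    return {code: round(c / len(intervals), 2) for code, c in counts.items()}

# Example: a 6-minute stretch of class coded as three 2-minute intervals.
observations = [
    Interval(0, instructor={"Lec"}, students={"L"}),
    Interval(2, instructor={"Lec", "PQ"}, students={"L", "AnQ"}),
    Interval(4, instructor={"MG"}, students={"WG"}),
]
print(code_frequencies(observations))
# {'Lec': 0.67, 'PQ': 0.33, 'MG': 0.33}
```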

  4. Activity Follow-up • Discuss in groups of 2-3 (5 minutes) – Compare observation notes • Large group (3-5 minutes) – How was the coding process? – What did you find after comparing notes?

  5. Benefits & Challenges of COPUS • Benefits – Validity and reliability (IRR) – Can capture a range of instructional styles – Provides detailed info about instructional practices – COPUS data can inform tenure and promotion and targeted professional development • Challenges – Timing, especially with multiple coders – Requires adequate training – Can be difficult to capture everything – Paper-based coding is cumbersome

  6. Generalized Observation Reporting Protocol (GORP) • Developed by researchers at UC Davis to facilitate use of COPUS – User-friendly interface; works on numerous devices – Automatically captures data at 2-min intervals – Allows for multiple coders and data download for inter-rater reliability (IRR) calculations • Tool can be customized for specific activities

  7. Generalized Observation Reporting Protocol (GORP): UC Davis Tools for Evidence-based Action

  8. Example: Introduction to Bioinformatics at UCLA

  9. Introduction to Bioinformatics • Goals and measures for computer science (and STEM) education – Increase engagement • # of questions and answers volunteered – Improve learning and academic performance • Exam scores ("Bloomed," i.e., coded with Bloom's taxonomy for cognitive rigor), final grades – Increase persistence rates, especially among women and URM students • Enrollment snapshots, final grades

  10. Course Timeline (major changes in course format by year) • 2003: Bioinformatics offered as standard lecture course • 2009: Incorporate Socratic method, posing questions and soliciting student answers verbally; switch from "grading on the curve" to grading based on previous year's distribution • 2011: Incorporate ORCT error discovery learning, enabling each student to answer target problems via laptop or smartphone; start compiling distinct conceptual errors made by students for each question • 2012: Build ORCT self-assessments based on identification of conceptual errors

  11. Open Response Concept Testing (ORCT) • Developed by a UCLA faculty member as an active learning tool to support conceptual understanding and reasoning – Interactive online tool – Uncovers instructor and student blind spots in understanding of course concepts – Generates "common errors" that help students identify misunderstandings (error discovery learning) – Used to customize resources and materials that students can use to re-examine and master concepts
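As a purely illustrative sketch of the error-discovery idea (this is not the actual ORCT implementation; the error bank, labels, and matching logic below are all invented), one could imagine checking a student's open response against an instructor-compiled bank of common conceptual errors:

```python
# Hypothetical sketch of error discovery: flag which compiled "common errors"
# a student's open response resembles. All data and logic are invented; the
# real ORCT tool's matching is not described in this presentation.

COMMON_ERRORS = {
    # error label -> phrases the instructor has seen signal that error
    "confuses conditional directions": ["p(a|b) equals p(b|a)", "same both ways"],
    "ignores the base rate": ["base rate", "prior doesn't matter"],
}

def suggest_common_errors(response: str) -> list[str]:
    """Return labels of compiled errors whose signal phrases appear in the response."""
    text = response.lower()
    return [
        label
        for label, phrases in COMMON_ERRORS.items()
        if any(phrase in text for phrase in phrases)
    ]

print(suggest_common_errors("I figured the prior doesn't matter here."))
# ['ignores the base rate']
```

In the course itself, the compiled errors also feed back into customized review materials, per the last bullet above.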

  12. Open Response Concept Testing (ORCT)

  13. Classroom Observation Data • Course lectures (3 COPUS-coded per term) – Recorded lectures: 2008, 2009, 2011, 2013 – Live observations: Fall 2015 • 2 observers per lecture (out of a team of 3 researchers) • Code for course-specific interventions – ORCT in lieu of clickers and experiments/demonstrations • Deal with limitations of lecture recordings – Eliminate codes for instructional activities not "observable" on video: instructor moving around the room, one-on-one conversations, etc. – Primarily track instructor activities, since students are often out of frame

  14. IRR Calculations: Cohen's Kappa • Used for qualitative/categorical variables • Adjusted for chance agreement (vs. raw % agreement) • Range: 0-1, with 1 = perfect agreement (values below 0 indicate worse-than-chance agreement) – Generally, Kappa > 0.70 considered satisfactory – Baseline Kappa = 0.82 for 2013 lectures • Calculated via preformatted Excel workbook for 2 observers – Alternatively via SPSS (crosstabs), Stata (kappa, kap), or SAS (proc freq)
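Cohen's kappa is kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of intervals on which the two observers agree and p_e is the agreement expected by chance from each observer's marginal code frequencies. Beyond the tools listed above, here is a minimal Python sketch; it treats each 2-minute interval as carrying a single code for simplicity, and the observation data are invented:

```python
# Cohen's kappa for two observers' interval-by-interval codes, computed with
# scikit-learn's cohen_kappa_score; kappa = (p_o - p_e) / (1 - p_e).

from sklearn.metrics import cohen_kappa_score

# One code per 2-minute interval, as recorded by each observer (invented data).
observer_1 = ["Lec", "Lec", "PQ", "Lec", "MG", "MG", "Lec", "Adm"]
observer_2 = ["Lec", "Lec", "PQ", "PQ",  "MG", "MG", "Lec", "Adm"]

kappa = cohen_kappa_score(observer_1, observer_2)
print(f"kappa = {kappa:.2f}")
# kappa = 0.82 for these invented data; > 0.70 is generally considered satisfactory
```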

  15. Student Activities in Lecture (Bioinformatics 2015, Week 6) [Figure: timeline of student activity codes (waiting, listening, posing question, answering question, ORCT: individual, ORCT: group, other group activity) over roughly 110 minutes of class]

  16. Instructor Activities in Lecture (Bioinformatics 2015, Week 6) [Figure: timeline of instructor activity codes (waiting, lecturing, answering question, posing question (non-ORCT), ORCT activity, follow-up on ORCT, real-time writing, administration) over roughly 110 minutes of class]

  17. Instructor Activities Over Time [Figure: instructor activity profiles across course years, annotated where the Socratic method and ORCT were introduced]

  18. Course Evaluations [Figure: class mean and SD (error bars) of workload/pace ratings on a 3-point scale from "too slow" to "too much", for 2004, 2005, 2006, 2011*, 2012, 2013, and 2015, annotated where ORCT was introduced]

  19. Retention Rates (Weeks 1-10), 2003-2015 [Figure: undergraduate and graduate retention rates by year, 2003-2015; plotted values range from 40.0% to 100.0%]

  20. UG Retention Rates (Weeks 1-10) by Gender, 2003-2015 [Figure: retention rates for UG women, UG men, and UG total, shown for 2003-2009 (pooled), 2010, 2011*, 2012, 2013, 2014, and 2015]

  21. Grad Retention Rates (Weeks 1-10) by Gender, 2003-2015 [Figure: retention rates for grad women, grad men, and grad total, shown for 2003-2009 (pooled), 2010, 2011*, 2012, 2013, 2014, and 2015]

  22. UG Retention Rates (Weeks 3-10) by Gender, 2003-2015 [Figure: retention rates for UG women, UG men, and UG total, shown for 2003-2008 (pooled), 2009*, 2010, 2011*, 2012, 2013, 2014, and 2015]

  23. Grad Retention Rates (Weeks 3-10) by Gender, 2003-2015 [Figure: retention rates for grad women, grad men, and grad total, shown for 2003-2008 (pooled), 2009*, 2010, 2011*, 2012, 2013, 2014, and 2015]
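A minimal sketch of how retention rates like those in the figures above might be computed from IR enrollment snapshots. The table layout, column names, and data are assumptions for illustration, not UCLA's actual schema:

```python
# Hypothetical weeks 1-10 retention from enrollment snapshots (invented data).

import pandas as pd

snapshots = pd.DataFrame({
    "student_id":       [1, 2, 3, 4, 5, 6],
    "year":             [2015] * 6,
    "level":            ["UG", "UG", "UG", "Grad", "Grad", "Grad"],
    "gender":           ["Women", "Men", "Women", "Women", "Men", "Men"],
    "enrolled_week_1":  [True, True, True, True, True, True],
    "enrolled_week_10": [True, True, False, True, True, False],
})

# Retention = share of week-1 enrollees still enrolled in week 10.
week1 = snapshots[snapshots["enrolled_week_1"]]
retention = (
    week1.groupby(["year", "level", "gender"])["enrolled_week_10"]
         .mean()      # booleans average to the retained fraction
         .mul(100)
         .round(1)
)
print(retention)
```

The same grouping with a weeks 3-10 enrollment flag would produce the later-start variant shown in slides 22-23.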

  24. UG Final Grades, 2003-2015 [Figure: stacked distribution of final grades (A/A+ = 4.0, A- = 3.7, B+ = 3.3, B = 3.0, B- = 2.7, C+ = 2.3, C = 2.0, C- or below = 1.7) by year, with the class average plotted on a 0.0-4.0 scale]

  25. Discussion • What is your institution’s current landscape for assessing (or proposing to assess) teaching & learning? • What types of IR data does your campus use to assess teaching & learning? • How might these tools be used or modified to fit your campus’ assessment needs? – COPUS/GORP (direct observation) – Course evaluations – Application data – Enrollment snapshots – Course grades

  26. Additional Examples of COPUS Research and Funding at UCLA • Life Sciences Core Curriculum (NSF) – How effective are LS core faculty's new, more student-centered practices? – Do faculty perceptions of teaching align with observable behaviors in the classroom? • PEERS Undergraduate Research & Mentoring (NSF) – How effective are workshop leaders' student-centered practices in new math workshops? – Does math workshops' use of active learning practices impact STEM retention for students in the PEERS program? • Lower Division Physics Courses (OID institutional grant) – How effective is faculty use of active learning pedagogy in making physics lectures/discussions/labs more inclusive? – Does active learning pedagogy improve student retention and concept mastery in lower division physics courses?

  27. Center for Educational Assessment, UCLA Office of Instructional Development • Contact: hwhang@oid.ucla.edu • UC Davis Tools for Evidence-based Action: http://t4eba.com • Grant: R25GM114822
