SLIDE 1

Connecting Instructional Assessment, IR Data, and Student Success

Hannah Whang Sayson, Casey Shapiro, Brit Toven-Lindsey

CAIR Annual Conference, November 16, 2016

SLIDE 2

Presentation Overview

  • Introduction to:
    – Classroom Observation Protocol for Undergraduate STEM (COPUS)
    – Generalized Observation Reporting Protocol (GORP)
  • Case study: UCLA bioinformatics course
  • Activity and discussion

SLIDE 3

Classroom Observation Protocol for Undergraduate STEM (COPUS)

Protocol developed by researchers at UMaine and UBC to investigate range and frequency of teaching practices in STEM classes

  • Snapshot of all classroom activities at 2-min intervals (see the sketch below)
    – Instructor and student activities
    – Pre-defined observation codes

Smith, M.K., Jones, F.H.M., Gilbert, S.L., & Wieman, C.E. (2013)
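
Below is a minimal illustrative sketch (in Python) of the kind of data COPUS-style interval coding produces: at each 2-minute interval the observer marks every instructor and student code that occurred. The code abbreviations follow Smith et al. (2013), but this data layout is an assumption for illustration, not the official COPUS instrument.

```python
from collections import Counter

# Subset of COPUS codes (Smith et al., 2013); the full protocol has ~25 codes.
INSTRUCTOR_CODES = {"Lec": "Lecturing", "PQ": "Posing question",
                    "AnQ": "Answering question", "Adm": "Administration"}
STUDENT_CODES = {"L": "Listening", "SQ": "Student asks question",
                 "WG": "Working in groups", "Ind": "Individual thinking"}

# One record per 2-minute interval; several codes may co-occur.
observations = [
    {"interval_min": 0, "instructor": {"Adm"},       "students": {"L"}},
    {"interval_min": 2, "instructor": {"Lec"},       "students": {"L"}},
    {"interval_min": 4, "instructor": {"Lec", "PQ"}, "students": {"L", "SQ"}},
]

# Range and frequency of instructor practices across intervals
freq = Counter(code for row in observations for code in row["instructor"])
print(freq)  # Counter({'Lec': 2, 'Adm': 1, 'PQ': 1})
```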

SLIDE 4

Activity Follow-up

  • Discuss in groups of 2-3 (5 minutes)
    – Compare observation notes
  • Large group (3-5 minutes)
    – How was the coding process?
    – What did you find after comparing notes?

SLIDE 5

Benefits & Challenges of COPUS

  • Benefits
    – Validity and reliability (IRR)
    – Can capture a range of instructional styles
    – Provides detailed info about instructional practices
    – COPUS data can be used for tenure and promotion, and to develop targeted professional development
  • Challenges
    – Timing, especially with multiple coders
    – Need adequate training
    – Can be difficult to capture everything
    – Paper coding is cumbersome

SLIDE 6

Generalized Observation Reporting Protocol (GORP)

  • Developed by researchers at UC Davis to facilitate use of COPUS
    – User-friendly interface; works on numerous devices
    – Automatically captures data at 2-min intervals
    – Allows for multiple coders and data download for inter-rater reliability (IRR) calculations (see the sketch below)
  • Tool can be customized for specific activities
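
As a hypothetical sketch of the download-for-IRR workflow: two coders' exported interval data are aligned by timestamp before computing agreement. The file names and column names ("interval_min", "code") are assumptions for illustration, not GORP's actual export schema.

```python
import pandas as pd

# Hypothetical GORP exports: one row per 2-minute interval per coder
coder_a = pd.read_csv("lecture01_coderA.csv")
coder_b = pd.read_csv("lecture01_coderB.csv")

# Align the two coders' records on the shared interval timestamps
merged = coder_a.merge(coder_b, on="interval_min", suffixes=("_a", "_b"))
labels_a = merged["code_a"].tolist()
labels_b = merged["code_b"].tolist()
# labels_a / labels_b then feed the Cohen's Kappa calculation (Slide 14)
```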

SLIDE 7

Generalized Observation Reporting Protocol (GORP)

[Screenshot: GORP interface (UC Davis Tools for Evidence-based Action)]

SLIDE 8

Example: Introduction to Bioinformatics at UCLA

SLIDE 9

Introduction to Bioinformatics

  • Goals and measures for computer science (and STEM) education
    – Increase engagement
      • # questions and answers volunteered
    – Improve learning and academic performance
      • Exam scores (“Bloomed” for cognitive rigor), final grades
    – Increase persistence rates, especially among women and URM students
      • Enrollment snapshots, final grades

SLIDE 10

Course Timeline

Year: Major changes in course format

2003
  • Bioinformatics offered as standard lecture course

2009
  • Incorporate Socratic method, posing questions and soliciting student answers verbally
  • Switch from “grading on the curve” to grading based on previous year’s distribution

2011
  • Incorporate ORCT error discovery learning, enabling each student to answer target problems via laptop or smartphone
  • Start compiling distinct conceptual errors made by students for each question

2012
  • Build ORCT self-assessments based on identification of conceptual errors

SLIDE 11

Open Response Concept Testing (ORCT)

  • Developed by UCLA faculty member as an active learning tool to support conceptual understanding and reasoning (see the sketch below)
    – Interactive online tool
    – Uncovers instructor and student blind spots in understanding of course concepts
    – Generates “common errors” that help students identify misunderstandings (error discovery learning)
    – Used to customize resources and materials that students can use to re-examine and master concepts
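
A minimal sketch of the error-compilation step described above: tallying the distinct conceptual errors students make on each question, so the most common ones can seed the “common errors” lists and later self-assessments. The question IDs and error labels here are hypothetical.

```python
from collections import Counter, defaultdict

# (question_id, conceptual_error) pairs assigned while reviewing open responses
graded = [
    ("q1", "confused sensitivity with specificity"),
    ("q1", "confused sensitivity with specificity"),
    ("q1", "ignored prior probability"),
    ("q2", "off-by-one error in alignment indexing"),
]

errors_by_question = defaultdict(Counter)
for qid, error in graded:
    errors_by_question[qid][error] += 1

# The most common errors per question become candidate "common errors"
for qid, counter in sorted(errors_by_question.items()):
    print(qid, counter.most_common(2))
```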

SLIDE 12

Open Response Concept Testing (ORCT)

SLIDE 13

Classroom Observation Data

  • Course lectures (3 COPUS-coded per term)
    – Recorded lectures: 2008, 2009, 2011, 2013
    – Live observations: Fall 2015
  • 2 observers per lecture (out of a team of 3 researchers)
  • Code for course-specific interventions
    – ORCT in lieu of Clickers and experiments/demonstrations
  • Deal with limitations of lecture recordings
    – Eliminate codes for instructional activities not “observable” with video: instructor moving around the room, one-on-one conversations, etc.
    – Primarily track instructor activities since students are often out of frame

SLIDE 14

IRR Calculations: Cohen’s Kappa

  • Used for qualitative/categorical variables
  • Adjusted for chance agreement (vs. raw % agreement)
  • Range: 0-1*, with 1 = perfect agreement (*Kappa can be negative when agreement falls below chance)
    – Generally, Kappa > 0.70 considered satisfactory
    – Baseline Kappa = 0.82 for 2013 lectures
  • Calculated via preformatted Excel workbook for 2 observers (a Python sketch follows below)
    – Alternatively via SPSS (crosstabs), Stata (kappa, kap), or SAS (proc freq)
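
A minimal sketch of the calculation (in Python rather than Excel): Cohen's Kappa is kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed proportion of intervals on which the two coders agree and p_e is the agreement expected by chance from each coder's marginal code frequencies. The example labels are hypothetical.

```python
from collections import Counter

def cohens_kappa(labels_a, labels_b):
    """Cohen's Kappa for two coders' categorical labels of the same intervals."""
    assert len(labels_a) == len(labels_b) and labels_a
    n = len(labels_a)
    # Observed agreement: share of intervals where the coders match
    p_o = sum(a == b for a, b in zip(labels_a, labels_b)) / n
    # Chance agreement: sum over codes of the product of marginal proportions
    counts_a, counts_b = Counter(labels_a), Counter(labels_b)
    p_e = sum(counts_a[c] * counts_b[c] for c in counts_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

coder_a = ["Lec", "Lec", "PQ", "AnQ", "Lec", "W"]
coder_b = ["Lec", "Lec", "PQ", "Lec", "Lec", "W"]
print(round(cohens_kappa(coder_a, coder_b), 2))  # 0.73
```

For larger analyses, scikit-learn's sklearn.metrics.cohen_kappa_score computes the same statistic and also supports weighted variants.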

SLIDE 15

Student Activities in Lecture

[Chart: Bioinformatics 2015, Week 6. Student activities per 2-min interval over 110 minutes of class; categories: Other group activity, ORCT: Group, ORCT: Individual, Answering question, Posing question, Listening, Waiting]

SLIDE 16

Instructor Activities in Lecture

[Chart: Bioinformatics 2015, Week 6. Instructor activities per 2-min interval over 110 minutes of class; categories: Administration, Real-time writing, Follow-up on ORCT, ORCT activity, Posing question (non-ORCT), Answering question, Lecturing, Waiting]

SLIDE 17

Instructor Activities Over Time

[Chart: Instructor activities across years, with the Socratic and ORCT course changes marked]

SLIDE 18

Course Evaluations

[Chart: Course evaluation ratings of Workload/Pace (anchors: Too much, Too slow) for 2004, 2005, 2006, 2011*, 2012, 2013, and 2015, with ORCT years marked; class mean and SD (error bars) on a 3-point scale]

SLIDE 19

Retention Rates (Weeks 1-10), 2003-2015

[Bar chart: Undergraduate vs. graduate retention rates (Weeks 1-10) by year, 2003-2015, with 2009* and 2011* marked; values range from 40.0% to 100.0%]

SLIDE 20

UG Retention Rates (Weeks 1-10) by Gender, 2003-2015

[Chart: UG retention rates (Weeks 1-10) for 2003-2009, 2010, 2011*, 2012, 2013, 2014, 2015; series: UG Women, UG Men, UG Total]

SLIDE 21

Grad Retention Rates (Weeks 1-10) by Gender, 2003-2015

[Chart: Grad retention rates (Weeks 1-10) for 2003-2009, 2010, 2011*, 2012, 2013, 2014, 2015; series: Grad Women, Grad Men, Grad Total]

SLIDE 22

UG Retention Rates (Weeks 3-10) by Gender, 2003-2015

[Chart: UG retention rates (Weeks 3-10) for 2003-2008, 2009*, 2010, 2011*, 2012, 2013, 2014, 2015; series: UG Women, UG Men, UG Total]

SLIDE 23

Grad Retention Rates (Weeks 3-10) by Gender, 2003-2015

[Chart: Grad retention rates (Weeks 3-10) for 2003-2008, 2009*, 2010, 2011*, 2012, 2013, 2014, 2015; series: Grad Women, Grad Men, Grad Total]

SLIDE 24

UG Final Grades, 2003-2015

[Chart: Distribution of UG final grades by year, 2003-2015; categories: A/A+ (4.0), A- (3.7), B+ (3.3), B (3.0), B- (2.7), C+ (2.3), C (2.0), C- or below (1.7), plus class average on a 0.0-4.0 scale]

SLIDE 25

Discussion

  • What is your institution’s current landscape for assessing (or proposing to assess) teaching & learning?
  • What types of IR data does your campus use to assess teaching & learning?
  • How might these tools be used or modified to fit your campus’ assessment needs?
    – COPUS/GORP (direct observation)
    – Course evaluations
    – Application data
    – Enrollment snapshots
    – Course grades

SLIDE 26

Additional Examples of COPUS Research and Funding at UCLA

  • Life Sciences Core Curriculum (NSF)
    – How effective are LS core faculty’s new, more student-centered practices?
    – Do faculty perceptions of teaching align with observable behaviors in the classroom?
  • PEERS Undergraduate Research & Mentoring (NSF)
    – How effective are workshop leaders’ student-centered practices in new math workshops?
    – Does math workshops’ use of active learning practices impact STEM retention for students in the PEERS program?
  • Lower Division Physics Courses (OID institutional grant)
    – How effective is faculty use of active learning pedagogy in making physics lectures/discussions/labs more inclusive?
    – Does active learning pedagogy improve student retention and concept mastery in lower division physics courses?

SLIDE 27

Center for Educational Assessment, UCLA Office of Instructional Development
Contact: hwhang@oid.ucla.edu

UC Davis Tools for Evidence-based Action: http://t4eba.com

NIH grant R25GM114822