Assessing the Reliability of a Problem-Solving Rubric when using Multiple Raters - PowerPoint PPT Presentation


SLIDE 1

Assessing the Reliability of a Problem-Solving Rubric when using Multiple Raters

T. Ryan Duckett, Uchenna Asogwa, Matthew W. Liberatore, Gale Mentzer, & Amanda Malefyt

timothy.duckett@utoledo.edu
http://www.utoledo.edu/engineering/chemical-engineering/liberatore/
June 2019

SLIDE 2

Conceptual framework

  • Student knowledge of engineering concepts
  • Rater-mediated assessment of student ability
  • Items, construct domains, rating scales, rubric fidelity

SLIDE 3

Study design

| # Raters | Participant N                                 | Problem Type            | Data Collected                           |
|----------|-----------------------------------------------|-------------------------|------------------------------------------|
| N/A      | 113 (39% female), 2 MW schools, Undergrad MEB | Traditional, Innovative | Full rating plan                         |
| 4        | 70                                            | Traditional, Innovative | Iterative, inter-rater reliability study |
| 5        | 20                                            | Traditional             | Iterative, inter-rater reliability study |

SLIDE 4

Typical homework problem


SLIDE 5

Grigg, S. J. & Benson, L. European Journal of Engineering Education, 2014. 39(6): 617-635.

Established rating tool: PROCESS

Problem-solving domain:
  • Problem definition
  • Represent the problem
  • Organize information
  • Calculations
  • Solution completion
  • Solution accuracy

SLIDE 6

Grigg, S. J. & Benson, L. European Journal of Engineering Education, 2014. 39(6): 617-635.

Detailed PROCESS rubric

Each problem-solving domain lists the tasks performed and a level of completion (with source of error). Example row for the Problem definition domain, task "Identify unknown":

  • Missing (0 points): Did not explicitly identify problem
  • Inadequate (1 point): Completed few problem/system tasks with many errors
  • Adequate (2 points): Completed most problem/system tasks with few errors
  • Accurate (3 points): Clearly and correctly identified and defined problem/system

SLIDE 7

Accuracy in assessment

The many-facet Rasch model:

log( P_nijk / P_nij(k-1) ) = B_n - D_i - C_j - F_k

  • B_n is the ability of student n.
  • D_i is the difficulty of item i.
  • C_j is the severity of judge j.
  • F_k is the extra difficulty overcome in being observed at the level of category k, relative to category k-1.
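The log-odds equation above can be turned into category probabilities by accumulating the term (B_n - D_i - C_j - F_k) across thresholds and normalizing. A minimal sketch; the facet values and thresholds in the example are made-up illustrations, not estimates from this study.

```python
import math

def category_probabilities(B, D, C, F):
    """P(category k) for k = 0..len(F) under the many-facet Rasch model,
    where F[h] is threshold F_{h+1}: log(P_k / P_{k-1}) = B - D - C - F_k."""
    # Cumulative sums of (B - D - C - F_k) give the unnormalized log-probabilities.
    logits = [0.0]
    for Fk in F:
        logits.append(logits[-1] + (B - D - C - Fk))
    total = sum(math.exp(x) for x in logits)
    return [math.exp(x) / total for x in logits]

# Illustrative facets (logits): able student, easyish item, slightly severe rater.
probs = category_probabilities(B=1.5, D=-0.5, C=0.2, F=[-1.0, 0.0, 1.0])
print([round(p, 3) for p in probs])  # one probability per category 0..3; sums to 1
```

Because the model is additive in the facets, a more severe rater (larger C) shifts probability mass toward lower categories for every student and item in the same way, which is what makes rater severity measurable on the common scale.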

SLIDE 8

Rasch fundamentals of measurement:
  • Principle of invariance
  • Monotonicity
  • Unidimensionality
  • Local independence

SLIDE 9

Rasch creates common measure

[Wright map: rater severity (Raters 1-5), student ability (Students A-T), PROCESS item difficulty (Solution Accuracy, Identify, Organize, Allocate, Represent, Solution Completion), and rating categories (0-3), all placed on a single common scale.]

SLIDE 10

Measuring rater bias


SLIDE 11

Raters discuss similar scores

Scores for Student E:

| Rater   | Problem definition | Represent problem | Organize knowledge | Calculate | Solution completion | Solution accuracy |
|---------|--------------------|-------------------|--------------------|-----------|---------------------|-------------------|
| Rater 1 | 3                  | 3                 | 3                  | 3         | 3                   | 1                 |
| Rater 2 | 3                  | 2                 | 3                  | 3         | 3                   | 1                 |
| Rater 3 | 3                  | 3                 | 3                  | 3         | 3                   | 3                 |
| Rater 4 | 3                  | 3                 | 3                  | 3         | 3                   | 1                 |
| Rater 5 | 3                  | 3                 | 3                  | 2         | 2                   | 1                 |

SLIDE 12

Raters identify differences

Scores for Student M:

| Rater   | Problem definition | Represent problem | Organize knowledge | Calculate | Solution completion | Solution accuracy |
|---------|--------------------|-------------------|--------------------|-----------|---------------------|-------------------|
| Rater 1 | 3                  | 3                 | 3                  | 2         | 3                   | 1                 |
| Rater 2 | 3                  | 2                 | 1                  | 1         | 2                   | 1                 |
| Rater 3 | 3                  | 3                 | 3                  | 3         | 2                   | 3                 |
| Rater 4 | 2                  | 3                 | 3                  | 2         | 1                   | 1                 |
| Rater 5 | 3                  | 3                 | 1                  | 3         | 2                   | 1                 |
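One simple way to quantify the contrast between the two score tables is pairwise exact agreement: the fraction of rater pairs giving identical scores, averaged over the six domains. A sketch using the scores from the Student E and Student M tables; `exact_agreement` is a hypothetical helper, not the reliability statistic used in the study.

```python
# Rows = Raters 1-5; columns = the six PROCESS domains, as on the slides.
student_E = [
    [3, 3, 3, 3, 3, 1],
    [3, 2, 3, 3, 3, 1],
    [3, 3, 3, 3, 3, 3],
    [3, 3, 3, 3, 3, 1],
    [3, 3, 3, 2, 2, 1],
]
student_M = [
    [3, 3, 3, 2, 3, 1],
    [3, 2, 1, 1, 2, 1],
    [3, 3, 3, 3, 2, 3],
    [2, 3, 3, 2, 1, 1],
    [3, 3, 1, 3, 2, 1],
]

def exact_agreement(scores):
    """Fraction of rater pairs giving identical scores, averaged over domains."""
    n_raters, n_domains = len(scores), len(scores[0])
    pairs = n_raters * (n_raters - 1) // 2
    agreeing = 0
    for d in range(n_domains):
        col = [row[d] for row in scores]
        agreeing += sum(col[i] == col[j]
                        for i in range(n_raters)
                        for j in range(i + 1, n_raters))
    return agreeing / (pairs * n_domains)

print(round(exact_agreement(student_E), 2))  # 0.73 -- raters mostly converge
print(round(exact_agreement(student_M), 2))  # 0.45 -- clear disagreement
```

The drop from 0.73 to 0.45 mirrors the slides' point: Student E's similar scores invite discussion of small differences, while Student M's scores expose rater disagreements worth resolving.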

SLIDE 13

Improving rater agreement

Level of agreement: Weak → Moderate

SLIDE 14

Iterative reliability evaluation

Conclusions

  • Accuracy of assessment
  • Identify source of measurement errors
  • Greater adherence to measurement principles

SLIDE 15

Katherine Roach, Caleb Sims, Lindsey Stevens, countless TAs

University of Toledo IRB protocols 202214

Thank you and…

DUE-1712186

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.