
SLIDE 1

Using Rubrics/Scoring Guides in Program Assessment

Carrie Zelna, Ph.D. Director, Office of Assessment Associate Vice Provost Division of Academic and Student Affairs Stephany Dunstan, Ph.D. Associate Director, Office of Assessment Division of Academic and Student Affairs

SLIDE 2

Outcomes

  • Identify possible opportunities for using rubrics/scoring guides as an assessment tool at the student level, class/course level, or program level
  • Describe the types of rubrics/scoring guides that can be applied to student, class/course, or program assessment
  • Identify the steps necessary to apply rubrics/scoring guides systematically (norming, sampling, analysis)

SLIDE 3

Levels of assessment

  • Individual/student level
  • Class/course level
  • Program/Curriculum level
SLIDE 4

Four Steps of Assessment

Linda Suskie (2009). Assessing Student Learning: A Common Sense Guide (2nd ed.). Jossey-Bass.

1. Establish Learning Goals (Plan)
2. Provide Learning Opportunities (Act)
3. Assess Student Learning (Observe)
4. Use the Results (Reflect)

Linda Suskie, in a 4/4/2008 email to the ASSESS listserv: “…understand that assessment is action research, not experimental research. While it is systematic, action research is context-specific, informal, and designed to inform individual practice. As such, it doesn't have the precision, rigor, or generalizability of experimental research.”

  • P. Steinke & C. Zelna
SLIDE 5

Rubric: Definition and Purpose

  • Rubric: “a scoring tool that lays out the specific expectations for an assignment” (Stevens & Levi, 2005, p. 3)
  • It is a way of organizing criteria to systematically determine if the outcome is met, based on data gathered through papers, observation, document analysis, or some other appropriate method.
  • When you review the data in the aggregate, a rubric can help identify patterns of strengths and weaknesses that might allow for enhancements to the program.

SLIDE 6

Constructed Response

  • Short-Answer Essay Questions
  • Concept Maps
  • Identifying Themes
  • Making Predictions
  • Summaries
  • Explain Your Solution

Course Assessment: Rubrics/Scoring Guides http://jfmueller.faculty.noctrl.edu/toolbox/tasks.htm

SLIDE 7

Product/Performance

“...reveals their understanding of certain concepts and skills and/or their ability to apply, analyze, synthesize or evaluate those concepts and skills” *

  • Research Paper
  • Capstone Project
  • Article Reviews
  • Film Analysis
  • Case Study
  • Error Analysis
  • Panel Discussion
  • Fishbowl Discussion
  • Oral Presentations

Course Assessment: Rubrics/Scoring Guides * http://jfmueller.faculty.noctrl.edu/toolbox/tasks.htm

SLIDE 8

Types of Rubrics

1. Holistic
2. Checklist
3. Rating Scale
4. Descriptive
5. Structured Observation Guide

SLIDE 9

Examples

SLIDE 10

Holistic (Suskie, 2009)

SLIDE 11

Checklist Rubric (Suskie, 2009)

SLIDE 12

Rating Scale Rubric (Suskie, 2009)

SLIDE 13

Descriptive Rubric

(Suskie, 2009)

SLIDE 14

Structured Observation Guide (Suskie, 2009)

SLIDE 15

Things to consider

  • Levels of Assessment
    • Individual
      • Alignment from outcome to assignment to rubric
      • Additional information to be measured (format, program outcomes, etc.)
      • Testing the rubric
    • Class/Course
      • All of the above
      • Sampling
      • Analysis: aggregate data
    • Program
      • All of the above
      • Norming/multiple raters
SLIDE 16

Testing Your Rubric/Scoring Guide

  • Metarubric: use a metarubric to review your work
  • Peer review: ask one of your colleagues to review the rubric and provide feedback on content
  • Student review: ask several students to review the rubric if appropriate (students in a class may help you create it)
  • Test with products: once you feel the rubric is ready, use samples of student work (perhaps 3-5 pieces) to test it

SLIDE 17

Sampling: Being Systematic

  • Should you sample at all or review all the products?
  • Mary Allen's rule of thumb: 50 to 75 products is sufficient for assessment
  • Consider the attitudes of the faculty toward data:
    • Quantitative approaches
    • Qualitative approaches

**The Office of Assessment can help choose random students for your assessment.
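As an illustration of drawing a sample systematically, here is a minimal Python sketch of a simple random sample. The roster, the sample size of 50 (the low end of Mary Allen's 50-75 rule of thumb), and the seed are all invented for the example:

```python
import random

# Hypothetical roster of 300 student papers; names are invented for the example.
roster = [f"student_{i:03d}" for i in range(1, 301)]

random.seed(42)  # fix the seed so the draw is reproducible and documentable
sample = random.sample(roster, 50)  # simple random sample, without replacement

print(len(sample))  # 50 papers selected for scoring
```

Fixing the seed means the same sample can be regenerated later, which helps when documenting the assessment process.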

SLIDE 18

Analysis

  • Individual vs. Aggregate Scores
  • Average Score (Mean), by Dimension and in Total
    • Total: review total scores to get the big picture
    • Dimension: review dimension scores to look for patterns
  • Frequency Distributions
    • Scale: review frequencies by scale value to get a clearer understanding of the data
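The aggregate analyses above (mean by dimension, total scores, and frequency distributions) can be sketched in a few lines of Python. The dimension names follow the slides, but the scores below are made up for illustration:

```python
from collections import Counter
from statistics import mean

# Illustrative rubric scores (scale 0-3), one list per dimension,
# one entry per student. Values are invented for the example.
dimensions = ["Separation/Objectivity", "Dissonance", "Resolution"]
scores = {
    "Separation/Objectivity": [3, 2, 3, 1, 2],
    "Dissonance":             [3, 3, 2, 2, 3],
    "Resolution":             [2, 1, 2, 2, 1],
}

# Mean per dimension: where are students strong or weak?
dimension_means = {d: mean(s) for d, s in scores.items()}

# Total score per student: the big-picture view.
total_scores = [sum(per_student)
                for per_student in zip(*(scores[d] for d in dimensions))]

# Frequency distribution per dimension: how many students earned each scale value?
frequencies = {d: Counter(s) for d, s in scores.items()}

print(dimension_means["Dissonance"])  # 2.6
print(total_scores)                   # [8, 6, 7, 5, 6]
print(frequencies["Resolution"][2])   # 3 students scored a 2 on Resolution
```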

SLIDE 19

Scoring the Data

Dimensions, in order: Separation/Objectivity, Dissonance, Understanding/Change in Perspective, Self-Perception, Resolution, Application/Verification

ID | Class | Age | Gender | Paper Length | Total | Dimension scores | Total
A | FR | 19 | F | 5 | 18 | 3 3 3 3 3 3 | 18
B | SR | 21 | M | 3 | 17 | 3 3 3 3 3 2 | 17
C | FR | 18 | F | 7 | 16 | 3 3 3 2 2 3 | 16
D | SR | 21 | M | 5 | 16 | 3 3 3 2 3 2 | 16
E | SO | 19 | F | 9 | 15 | 2 3 3 2 2 3 | 15
F | FR | 18 | M | 3 | 14 | 3 3 3 2 3 | 14
G | SO | 20 | M | 3 | 14 | 3 3 3 3 2 | 14
H | SO | 19 | M | 5 | 13 | 2 2 3 2 2 2 | 13
I | FR | 18 | M | 8 | 13 | 3 3 3 2 2 | 13
J | JR | 20 | F | 5 | 13 | 2 2 2 2 3 2 | 13
K | SO | 20 | M | 5 | 13 | 3 3 2 2 2 1 | 13
L | FR | 18 | M | 7 | 13 | 2 3 2 2 2 2 | 13
M | JR | 20 | F | 3 | 11 | 3 3 3 2 | 11
N | FR | 18 | F | 5 | 10 | 2 2 2 2 2 | 10
O | SO | 22 | M | 4 | 10 | 2 3 2 2 2 | 11
P | FR | 18 | F | 6 | 10 | 2 3 1 2 1 1 | 10
Q | FR | 19 | M | 9 | 9 | 2 2 1 2 1 1 | 9
R | FR | 18 | M | 3 | 9 | 2 3 2 1 1 | 9
S | FR | 18 | M | 15 | 7 | 2 1 1 1 1 1 | 7
T | SO | 20 | F | 4 | 7 | 1 2 1 2 1 | 7
Average score by dimension: 2.5  2.8  2.4  1.8  2.2  1.4 | 13.1

SLIDE 20

Frequencies

[Bar chart: frequency of each scale score (0-3) for the dimensions Separation/Objectivity, Dissonance, Understanding/Change in Perspective, Self-Perception, Resolution, and Application/Verification]

Dimension                              Scale 3  Scale 2  Scale 1  Scale 0
Separation/Objectivity                    9       10        1        2
Dissonance                               14        5        1        2
Understanding/Change in Perspective      10        6        3        3
Self-Perception                           2       13        3        4
Resolution                                6       10        4        2
Application/Verification                  3        6        5        8

SLIDE 21

Example: University of Virginia

SLIDE 22

Other “Scoring” Options

  • Structured Observation Guide: a thematic approach, using qualitative coding to determine what you are seeing

SLIDE 23

Norming with Multiple Raters

  • Will multiple reviewers look at each product?
  • Consider the time frame (when must all products be scored?)
  • Will raters score together in the same physical location?
  • Spend time walking through the rubric/scoring guide as a group.
  • Review one product individually, then compare responses as a group: share why scores were chosen, discuss, and reach consensus.
  • Repeat the process above as needed (5 to 10 products) until you feel comfortable that you are on the same page.
    • This could result in additional changes to the rubric/scoring guide in some cases.
  • Consider doing this throughout the process to ensure that you are not drifting (“recalibrate”).
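One simple way to check whether raters are "on the same page" is exact percent agreement. The statistic and the scores below are illustrative assumptions, not something the presentation prescribes:

```python
def percent_agreement(rater_a, rater_b):
    """Share of products on which two raters gave the identical score."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Made-up scores for 8 norming papers, one list per rater.
rater_a = [3, 2, 2, 3, 1, 2, 3, 2]
rater_b = [3, 2, 1, 3, 1, 2, 2, 2]

agreement = percent_agreement(rater_a, rater_b)
print(agreement)  # 0.75
```

The two papers where the raters disagree (the third and seventh) are exactly the products worth discussing as a group before scoring continues.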

SLIDE 24

Creating a Rubric

1. What are you trying to assess? (Outcomes)
2. What does the outcome ‘look’ like?
3. How can you group the items/elements/criteria/dimensions of the outcome?
4. List these elements with a brief description of the best-case scenario.
5. How will the student demonstrate this learning? Does the assignment align with the list of elements?
6. What else needs to be in the rubric to grade the work?
7. Do you need specific information for the curriculum?
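Steps 3-5 above amount to structuring the rubric as data: a set of dimensions, a scale, and a best-case description of each element. A hypothetical sketch in Python (the outcome, dimensions, and scores are all invented for the example):

```python
# A rating-scale rubric represented as plain data.
rubric = {
    "outcome": "Students can analyze a case study",
    "scale": {3: "Exceeds", 2: "Meets", 1: "Approaching", 0: "Not evident"},
    "dimensions": {  # element -> best-case description (steps 3-4)
        "Identifies the problem": "States the central problem precisely",
        "Uses evidence": "Supports claims with specific evidence from the case",
        "Draws conclusions": "Conclusions follow logically from the analysis",
    },
}

def score_product(ratings, rubric):
    """Check each rating against the rubric's dimensions and scale,
    then return the total score."""
    for dim, value in ratings.items():
        assert dim in rubric["dimensions"], f"unknown dimension: {dim}"
        assert value in rubric["scale"], f"score off the scale: {value}"
    return sum(ratings.values())

total = score_product(
    {"Identifies the problem": 3, "Uses evidence": 2, "Draws conclusions": 2},
    rubric,
)
print(total)  # 7
```

Writing the rubric down as data like this makes the alignment check in step 5 concrete: every element the assignment is supposed to demonstrate must appear as a dimension.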

SLIDE 25

References/Resources

  • AAC&U VALUE Rubrics: http://www.aacu.org/value/rubrics/index_p.cfm?CFID=37317515&CFTOKEN=54026278
  • Allen, M. J. (2006). Assessing General Education Programs. Bolton, MA: Anker Publishing Co.
  • Stevens, D. D., & Levi, A. J. (2005). Introduction to Rubrics. Sterling, VA: Stylus Publishing, LLC.
  • Suskie, L. (2009). Assessing Student Learning: A Common Sense Guide (2nd ed.). San Francisco, CA: Jossey-Bass.