

SLIDE 1

VALUE: Lessons Learned

Kate McConnell & Erin Horan Association of American Colleges and Universities

SLIDE 2

Today’s session

  • Overview of the MSC
  • Scoring and reporting
  • Lessons learned as related to validity
  • Unintended consequences? Faculty development
  • Not as much on state and federal policies
SLIDE 3
SLIDE 4
SLIDE 5
SLIDE 6
SLIDE 7

Terminology

  • VALUE - Valid Assessment of Learning in Undergraduate Education
  • SHEEO - State Higher Education Executive Officers Association
  • MSC - Multi-State Collaborative
SLIDE 8
SLIDE 9
SLIDE 10

VALUE Approach to Assessment

SLIDE 11

http://www.aacu.org/OnSolidGroundVALUE

SLIDE 12

Lessons Learned from Refinement Year Surveys

  • Faculty viewed rubrics as valid constructs of the learning outcomes
  • It is important to have data analysis in mind before collecting data

– Output as numbered categories rather than long, wordy string variables (e.g., state and term; coding systems already exist)
– Be sure the output file is something you can work with (CSV, no random spacing in variable names)
– Have data entered with choices rather than writing out the institution name (small typos make a mess)

  • It is important to provide models for reporting and displaying data

–e.g., SHEEO provided results for each state in the MSC
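The data-handling advice above can be sketched in code (a hypothetical illustration: the column names, numeric codes, and scores are invented for this sketch, not drawn from the MSC):

```python
import csv
import io

# Hypothetical numbered coding schemes, instead of long string variables.
STATE_CODES = {1: "Connecticut", 2: "Massachusetts", 3: "Oregon"}
INSTITUTION_CODES = {10: "Example State University", 11: "Example Community College"}

# Data entered as coded choices, so institution names cannot be mistyped.
rows = [
    {"state": 1, "institution": 10, "score": 3},
    {"state": 2, "institution": 11, "score": 2},
]

# Write a clean CSV: short variable names, no stray spacing.
buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["state", "institution", "score"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()

# Codes map back to readable labels only at reporting time.
first_state = STATE_CODES[rows[0]["state"]]
```

Keeping codes in the data file and labels in a lookup table is what makes the resulting CSV easy to analyze.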

SLIDE 13

VALUE Initiative Results for the Refinement Year

  • Includes all institutions, public and private
  • 75% completion

– 2-year institutions = 45+ credit hours
– 4-year institutions = 90+ credit hours

  • Critical Thinking: 5 Dimensions

– 2-year institutions, 45+ credit hours: 1,283 pieces of student work
– 4-year institutions, 90+ credit hours: 2,006 pieces of student work

SLIDE 14

VALUE Initiative Results for the Refinement Year 75% Completion

SLIDE 15

So what’s missing?

  • Validity

– Rubric design
– Rubric application, score interpretation, and use

SLIDE 16

Standards for Test Design and Development (ch. 4)

  • 1. Standards for Test Specifications
  • 2. Standards for Item Development and Review
  • 3. Standards for Developing Test Administration and Scoring Procedures and Materials
  • 4. Standards for Test Revisions
SLIDE 17

VALUE Timeline

  • When were the VALUE rubrics released?
SLIDE 18

VALUE Timeline

  • 2008 Rubric development with VALUE Partner Campuses
  • 2009 National Review Panel (Rhodes, 2011)
  • 2009 Initial release of VALUE rubrics
  • 2010 National Inter-Rater Reliability Study (Finley, 2011)
  • 2011 Case studies of institutional use (Finley and Rhodes, 2013)
  • 2014-2017 MSC
SLIDE 19
  • 1. Standards for Test Specifications
  • All have the same structure
SLIDE 20
SLIDE 21
SLIDE 22
  • 1. Standards for Test Specifications: ease of use, instructions given by test administrators
  • All have the same structure
  • AAC&U specifies they should be used when scoring for assessment, not for grading
  • Leadership campuses tested VALUE rubrics for ease of use
SLIDE 23
  • 2. Standards for Item Development and Review
  • Documentation of the rubric development process: teams, advisory board, testing with partner campuses, at least three rounds of drafting (Rhodes, 2009)
  • National Review Panel to test rubrics before release: .8 reliability without training (Rhodes, 2011)
  • MSC showed faculty found rubrics to encompass key elements of each learning outcome (McConnell and Rhodes, 2017)
  • Assignment design
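The ".8 reliability" cited above is an agreement statistic between raters. As a hedged sketch of the simplest such statistic, exact and adjacent agreement (the function and the scores below are invented for illustration, not data from the study):

```python
# Exact and adjacent agreement between two raters applying a 0-4 rubric.
def agreement(rater_a, rater_b, tolerance=0):
    """Fraction of artifacts on which the raters' scores differ by at most `tolerance`."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same artifacts")
    hits = sum(abs(a - b) <= tolerance for a, b in zip(rater_a, rater_b))
    return hits / len(rater_a)

# Hypothetical scores for ten pieces of student work.
rater_a = [3, 2, 4, 1, 3, 2, 0, 4, 3, 2]
rater_b = [3, 2, 3, 1, 3, 3, 0, 4, 2, 2]

exact = agreement(rater_a, rater_b)        # identical scores only
adjacent = agreement(rater_a, rater_b, 1)  # within one rubric level
```

Published rubric studies typically report adjacent agreement or chance-corrected coefficients rather than raw exact agreement, so this is an illustration of the idea, not the study's method.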
SLIDE 24
  • 3. Standards for Developing Test Administration and Scoring Procedures and Materials
  • AAC&U offers recommendations related to:

– Training and scoring (Rhodes and Finley, 2013)
– Presenting and reporting data (McConnell and Rhodes, 2017)
SLIDE 25
  • 4. Standards for Test Revisions
  • Future work: assignment (re)design, rubric revisions
SLIDE 26

Before moving on to the exciting extra effects…

  • Questions related to validity?
  • Questions for Terry?
SLIDE 27

How the rubrics encourage support for assessment

  • What diminishes campus support
  • How to encourage support
  • How the VALUE approach accomplishes this
SLIDE 28

How to encourage support

  • Expectancy-Value Theory of motivation

– Expectancy: ability to accomplish the task
– Value: perceived importance
– Cost: sacrifice required

MacDonald, S. K., Williams, L. M., Lazowski, R. A., Horst, S. J., & Barron, K. E. (2014). Faculty attitudes toward general education assessment: A qualitative study about their motivation. Research & Practice in Assessment, 9.

SLIDE 29

What diminishes campus support?

  • Imposed by external sources
  • Failure to understand the purpose
  • Threat to academic freedom
  • Additional responsibilities with no incentives
  • Disconnect from everyday classroom activity
SLIDE 30

How to encourage support

  • Imposed by external sources
  • Failure to understand the purpose
  • Threat to academic freedom
  • Additional responsibilities with no incentives
  • Disconnect from everyday classroom activity

  • Involve faculty throughout the process
  • Portray an intrinsic desire to learn from assessments

SLIDE 31

How to encourage support

  • Imposed by external sources
  • Failure to understand the purpose
  • Threat to academic freedom
  • Additional responsibilities with no incentives
  • Disconnect from everyday classroom activity

  • Invest in training
  • Provide ongoing support
  • Create usable, digestible reports
SLIDE 32

How to encourage support

  • Imposed by external sources
  • Failure to understand the purpose
  • Threat to academic freedom
  • Additional responsibilities with no incentives
  • Disconnect from everyday classroom activity

  • Use faculty-created, course-embedded assessments

SLIDE 33

How to encourage support

  • Imposed by external sources
  • Failure to understand the purpose
  • Threat to academic freedom
  • Additional responsibilities with no incentives
  • Disconnect from everyday classroom activity

  • Count assessment towards scholarship
  • Offer payment for trainings
SLIDE 34

How to encourage support

  • Imposed by external sources
  • Failure to understand the purpose
  • Threat to academic freedom
  • Additional responsibilities with no incentives
  • Disconnect from everyday classroom activity

  • Involve faculty by including their own disciplinary interests
SLIDE 35

How to encourage support

  • Involve faculty throughout the process
  • Portray an intrinsic desire to learn from assessments
  • Invest in training
  • Provide ongoing support
  • Create usable, digestible reports
  • Use faculty-created, course-embedded assessments
  • Count assessment towards scholarship
  • Offer payment for trainings
  • Involve faculty by including their own disciplinary interests
SLIDE 36

Professional Development

  • Feedback from scorers summer 2017, Refinement Year
SLIDE 37

Professional Development

“I found this training to be very good in terms of my professional development as a professor. I scored papers from many disciplines as well as my own and can now see how to work more closely with my students to further assist in getting back work that I expect from them.”

SLIDE 38

Professional Development

  • Assignment design

– “Many student work artifacts are not a great fit for the QL rubric”
– “I often felt as if I was assessing the assignment design rather than the student's work. I gave many zeros simply because the assignment did not fit the rubric.”

  • Thinking about learning outcomes

– “I think I have a better sense of what makes good writing.”

  • Desire for more professional development

– “I would appreciate getting more feedback about my scoring.”
– “I manage assessment on my campus. Scoring this work gave me insight into assignment design that I can take to my faculty.”

SLIDE 39

Future Directions

  • Validity: how rubrics are actually being used beyond recommendations made by AAC&U
  • Full report on validity of rubrics
  • Full results of three years of the MSC
  • Possible rubric revisions
SLIDE 40

References

AERA, APA, & NCME. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

Finley, A. (2011). How reliable are the VALUE rubrics? Peer Review, 13/14(4/1), 31-34. Retrieved from http://www.aacu.org/publications-research/periodicals/how-reliable-are-value-rubrics

McConnell, K., & Rhodes, T. (2017). On solid ground: VALUE report 2017. Washington, DC: AAC&U.

Rhodes, T. L., & Finley, A. (2013). Using the VALUE rubrics for improvement of learning and authentic assessment. Washington, DC: Association of American Colleges and Universities.

Rhodes, T. L. (2011). Emerging evidence on using rubrics. Peer Review, 13/14(4/1). Retrieved from http://www.aacu.org/publications-research/periodicals/emerging-evidence-using-rubrics

Rhodes, T. L. (2010). Assessing outcomes and improving achievement: Tips and tools for using rubrics. Washington, DC: Association of American Colleges and Universities.