

  1. VALUE: Lessons Learned. Kate McConnell & Erin Horan, Association of American Colleges and Universities

  2. Today’s session • Overview of the MSC • Scoring and reporting • Lessons learned as related to validity • Unintended consequences? Faculty development • Not as much on state and federal policies

  3. Terminology • VALUE - Valid Assessment of Learning in Undergraduate Education • SHEEO - State Higher Education Executive Officers Association • MSC - Multi-State Collaborative

  4. VALUE Approach to Assessment

  5. http://www.aacu.org/OnSolidGroundVALUE

  6. Lessons Learned from Refinement Year Surveys • Faculty viewed rubrics as valid constructs of the learning outcomes • It is important to have data analysis in mind before collecting data (see the sketch below) – Output as numbered categories rather than long, wordy string variables (e.g., state and term, where coding systems already exist) – Be sure the output file is something you can work with (e.g., CSV with no stray spacing in variable names) – Have data entered from a fixed set of choices rather than writing out the institution name (small typos make a mess) • It is important to provide models for reporting and displaying data – e.g., SHEEO provided results for each state in the MSC
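
To make the data-hygiene advice above concrete, here is a minimal Python/pandas sketch of the kind of cleanup these survey lessons point to. It is an illustration only: the file name, column names, state codes, and institution roster are invented, not drawn from the MSC data.

    # Hypothetical cleanup of a scoring export, illustrating the advice above.
    # All file, column, and institution names are invented for this sketch.
    import pandas as pd

    df = pd.read_csv("scores_export.csv")  # assumed export file

    # "No random spacing in variable names": normalize column headers once.
    df.columns = df.columns.str.strip().str.lower().str.replace(" ", "_")

    # "Numbered categories rather than long, wordy string variables":
    # map free-text state names onto an existing coding system (FIPS here).
    state_codes = {"Connecticut": 9, "Massachusetts": 25, "Minnesota": 27}
    df["state_code"] = df["state"].map(state_codes)

    # "Choices rather than writing out the institution name": coerce entries
    # to a fixed roster so small typos surface as missing values for review.
    roster = ["Institution A", "Institution B", "Institution C"]
    df["institution"] = pd.Categorical(df["institution"], categories=roster)
    print(df[df["institution"].isna()])  # rows that need hand-checking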

  7. VALUE Initiative Results for the Refinement Year • Includes all institutions, public and private • 75% completion, i.e., at least 75% of the credits in a typical 60-credit associate's or 120-credit bachelor's program – 2-year institutions = 45+ credit hours – 4-year institutions = 90+ credit hours • Critical Thinking: 5 dimensions – 2-year institutions, 45+ credit hours: 1,283 pieces of student work – 4-year institutions, 90+ credit hours: 2,006 pieces of student work

  8. VALUE Initiative Results for the Refinement Year 75% Completion

  9. So what’s missing? • Validity – Rubric design – Rubric application, score interpretation, and use

  10. Standards for Test Design and Development (ch. 4 of AERA, APA, & NCME, 2014) 1. Standards for Test Specifications 2. Standards for Item Development and Review 3. Standards for Developing Test Administration and Scoring Procedures and Materials 4. Standards for Test Revisions

  11. VALUE Timeline • When were the VALUE rubrics released?

  12. VALUE Timeline • 2008 Rubric development with VALUE Partner Campuses • 2009 National Review Panel (Rhodes, 2011) • 2009 Initial release of VALUE rubrics • 2010 National Inter-Rater Reliability Study (Finley, 2011) • 2011 Case studies of institutional use (Rhodes and Finley, 2013) • 2014-2017 MSC

  13. 1. Standards for Test Specification • All have the same structure

  14. 1. Standards for Test Specification (ease of use; instructions given by test administrators) • All have the same structure • AAC&U specifies the rubrics should be used when scoring for assessment, not for grading • Leadership campuses tested VALUE rubrics for ease of use

  15. 2. Standards for Item Development and Review • Documentation of the rubric development process: teams, an advisory board, testing with partner campuses, at least three rounds of drafting (Rhodes, 2009) • National Review Panel tested rubrics before release: 0.8 reliability without training (Rhodes, 2011) • MSC showed faculty found the rubrics to encompass key elements of each learning outcome (McConnell and Rhodes, 2017) • Assignment design
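
The "0.8 reliability without training" figure above comes from the National Review Panel study (Rhodes, 2011); the slide does not say which agreement statistic was used. Purely as an illustration of what an inter-rater reliability check can look like, the sketch below computes a weighted Cohen's kappa for two raters' rubric scores; the scores are invented.

    # Illustrative inter-rater agreement check for ordinal rubric scores (0-4).
    # Cohen's kappa is one common choice; the statistic behind the reported
    # 0.8 is not specified here, and the scores below are invented.
    from sklearn.metrics import cohen_kappa_score

    rater_a = [4, 3, 3, 2, 4, 1, 0, 3, 2, 2]
    rater_b = [4, 3, 2, 2, 4, 1, 1, 3, 2, 3]

    # Quadratic weighting credits near-misses, which suits a 0-4 rubric scale.
    kappa = cohen_kappa_score(rater_a, rater_b, weights="quadratic")
    print(f"quadratic-weighted kappa: {kappa:.2f}")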

  16. 3. Standards for Developing Test Administration and Scoring Procedures and Materials AAC&U offers recommendations related to • Training and scoring (Rhodes and Finley, 2013) • Presenting and reporting data (McConnell and Rhodes, 2017)

  17. 4. Standards for Test Revisions • Future work: assignment (re)design, rubric revisions

  18. Before moving on to the exciting extra effects… • Questions related to validity? • Questions for Terry?

  19. How the rubrics encourage support for assessment • What diminishes campus support • How to encourage support • How the VALUE approach accomplishes this

  20. How to encourage support • Expectancy-Value Theory of motivation – Expectancy: perceived ability to accomplish the task – Value: perceived importance – Cost: the sacrifice required MacDonald, S. K., Williams, L. M., Lazowski, R. A., Horst, S. J., & Barron, K. E. (2014). Faculty attitudes toward general education assessment: A qualitative study about their motivation. Research & Practice in Assessment, 9.

  21. What diminishes campus support? • Imposed by external sources • Failure to understand the purpose • Threat to academic freedom • Additional responsibilities with no incentives • Disconnect from everyday classroom activity

  22. How to encourage support • Imposed by external sources • Involve faculty throughout the process • Failure to understand the purpose • Portray an intrinsic desire to learn from assessments • Threat to academic freedom • Additional responsibilities with no incentives • Disconnect from everyday classroom activity

  23. How to encourage support • Imposed by external sources • Invest in training • Failure to understand the purpose • Provide ongoing support • Threat to academic freedom • Create usable, digestible reports • Additional responsibilities with no incentives • Disconnect from everyday classroom activity

  24. How to encourage support • Imposed by external sources • Use faculty-created, course-embedded assessments • Failure to understand the purpose • Threat to academic freedom • Additional responsibilities with no incentives • Disconnect from everyday classroom activity

  25. How to encourage support • Imposed by external sources • Count assessment towards scholarship • Failure to understand the purpose • Offer payment for trainings • Threat to academic freedom • Additional responsibilities with no incentives • Disconnect from everyday classroom activity

  26. How to encourage support • Imposed by external sources • Involve faculty by including their own disciplinary interests • Failure to understand the purpose • Threat to academic freedom • Additional responsibilities with no incentives • Disconnect from everyday classroom activity

  27. How to encourage support • Involve faculty throughout the process • Portray an intrinsic desire to learn from assessments • Invest in training • Provide ongoing support • Create usable, digestible reports • Use faculty-created, course-embedded assessments • Count assessment towards scholarship • Offer payment for trainings • Involve faculty by including their own disciplinary interests

  28. Professional Development • Feedback from scorers, summer 2017 (Refinement Year)

  29. Professional Development “I found this training to be very good in terms of my professional development as a professor. I scored papers from many disciplines as well as my own and can now see how to work more closely with my students to further assist in getting back work that I expect from them.”

  30. Professional Development • Assignment design – “Many student work artifacts are not a great fit for the QL rubric” – “I often felt as if I was assessing the assignment design rather than the student's work. I gave many zeros simply because the assignment did not fit the rubric.” • Thinking about learning outcomes – “I think I have a better sense of what makes good writing.” • Desire for more professional development – “I would appreciate getting more feedback about my scoring.” – “I manage assessment on my campus. Scoring this work gave me insight into assignment design that I can take to my faculty.”

  31. Future Directions • Validity: how rubrics are actually being used beyond the recommendations made by AAC&U • Full report on the validity of the rubrics • Full results of the three years of the MSC • Possible rubric revisions

  32. References

AERA, APA, & NCME. (2014). Standards for educational and psychological testing. Washington, DC: American Educational Research Association.

Finley, A. (2011). How reliable are the VALUE rubrics? Peer Review, 13/14(4/1), 31-34. Retrieved from http://www.aacu.org/publications-research/periodicals/how-reliable-are-value-rubrics

McConnell, K., & Rhodes, T. (2017). On solid ground: VALUE report 2017. Washington, DC: AAC&U.

Rhodes, T. L. (2010). Assessing outcomes and improving achievement: Tips and tools for using rubrics. Washington, DC: Association of American Colleges and Universities.

Rhodes, T. L. (2011). Emerging evidence on using rubrics. Peer Review, 13/14(4/1). Retrieved from http://www.aacu.org/publications-research/periodicals/emerging-evidence-using-rubrics

Rhodes, T. L., & Finley, A. (2013). Using the VALUE rubrics for improvement of learning and authentic assessment. Washington, DC: Association of American Colleges and Universities.
