PBL Assessment Fit for Purpose
Ingrid Scholten, Ed.D., Flinders University
25/05/2017


SLIDE 1

PBL Assessment Fit for Purpose

Ingrid Scholten, Ed.D. Flinders University

Overview

  • Why focus on assessment?
  • What do we want from assessment in PBL?
  • What are the challenges?
  • How can assessment be used to promote learning?
  • What are some of the key features?
  • How can effective processes be sustained?
  • What is programmatic assessment?
SLIDE 2

Why PBL?

  • It works!
  • Improved performance when compared with traditional curricula (Strobel & Van Barneveld, 2009)

  • Generates a stimulating and challenging educational environment
  • “Hard Fun” Barrett, 2004; Papert, 2002
  • Students take charge of their learning
  • Facilitates positive attitudes to learning (meta-analysis: Demirel & Dagyar, 2016)
  • Possibly improved metacognition
  • Authentic materials from day one
  • Promote skills required for future practice
  • Communication skills, interpersonal skills, teamwork, problem solving, reasoning, critical thinking, independent responsibility

SLIDE 3

Why focus on assessment?

  • Assessment is viewed by the students as the driver of learning
  • Regardless of
  • Intended LOs
  • Curriculum design
  • Nature of content
  • Teaching excellence
  • Assessment is powerful
  • Influences what and how students learn
  • Engage, include and retain students
  • Stimulate them to take a deep approach to learning

(Sambell, 2016)

  • Source of greatest student dissatisfaction

(Soilemetzidis et al., 2014)

  • Academic stress disrupts cortical plasticity; hinders learning

(Concerto et al., 2017)

Why is it especially important for PBL?

  • Learning outcomes in PBL include not only skills and knowledge, but attitude and disposition

  • PBL facilitates development of this “way of being”
  • Far‐reaching effects; students
  • Acquire knowledge and skills
  • Integrate what they have learned
  • Form their identities – personal, professional and academic
  • Measuring/assessing “attitude” and “disposition” is challenging
  • Traditional testing culture, with high stakes exams
  • Inconsistent with PBL
  • Continuous and reiterative student participation
SLIDE 4

What’s wrong with the status quo?

  • Traditional testing culture
  • Heavily reliant on psychometric models in test development
  • Strongly influenced by notions of objectivity and fairness
  • Traditional assessment tools and practices limited in scope
  • Test predominantly disciplinary or declarative knowledge
  • Drive teaching for assessment
  • Treat assessment as a separate, post‐teaching activity
  • Focus on individual achievement
  • Focus on artificial constraints, such as prescribed duration; high-stakes, big summative final exam

  • Assess only the learning product

Why do we buy into it?

  • Insidious contextual factors
  • Spiral down to teachers and students, maintaining the status quo
  • (Contrived) declining resources, pressure for increased research productivity and less time for teaching and student interaction

  • Increasing class sizes
  • Computer‐aided marking
  • Low‐level questions easier to design and “more objective”
  • Concerns about declining standards
  • Caution re changes to assessment policies, innovative assessments
  • Increased formal legal requirements
  • Domination properties of traditional assessment
  • Teachers maintain power, authority and control
  • Censorship – exams used to define what knowledge is worthy of acquiring
SLIDE 5

What do we want from assessment in PBL?

  • All assessment should
  • Contribute to helping students to learn and succeed
  • Inform both students and teachers
  • Communicate assessment criteria and standards
  • Permit planning of future learning activities
  • Be diverse, inclusive and responsive
  • Offer choice and flexibility
  • Assess both product and processes
  • Encompass knowledge, skills and dispositions
  • Measure multidimensional aspects of competencies
  • Provide prompt and constructive feedback
  • Allow access to tools or resources used in real life
  • Interesting and authentic
  • Avoid time pressure
  • Involve multiple assessments rather than a single one
  • Assessment in PBL should
  • Be aligned with course/module objectives/outcomes
  • Be authentic
  • Assess process‐based professional activity
  • Reflect learners’ development from novice to expert*
  • Include student self‐assessment and reflection

(Macdonald & Savin‐Baden, 2004)
*Supported by a variety of taxonomies/frameworks

SLIDE 6

What is curriculum alignment? What is authentic assessment?

(Biggs, 1996; Biggs & Tang, 2011)

  • Authenticity
  • Tasks have realistic value
  • Question of credibility; validity; fidelity; competence
  • Performance, or holistic assessment
  • Underlying motivator for active student engagement and involvement
  • Promotes understanding
  • Students develop their personal and professional identities
  • See the relevance and meaning of the assessment
  • Engaged cognitively, but also affectively
  • All disciplines can embed authenticity into assessment tasks
SLIDE 7

  • Authentic assessment is not about reliability
  • It’s about making a judgement of a student’s competence
  • The tasks are often complex and open ended
  • Decisions are frequently based on multiple scores and assessors
  • Avoid pretence
  • If construed by students as artificial, may have unintended consequences
  • Focus on the most salient characteristics of learning (LOs)
  • Assessment should become increasingly “authentic” across the program
  • Designing authentic assessments
  • Match real‐world tasks
  • Create products
  • Relatively undefined and open to multiple interpretations and solutions
  • But with clear, comprehensible criteria and standards
  • Collaborative
  • Enable learners to make choices and reflect on their learning
  • Encourage students to adopt diverse perspectives and roles
  • Weave summative assessment seamlessly throughout the program in ways that reflect real‐world evaluation processes (Baartman, Prins, Kirschner & van der Vleuten, 2007; Lombardi, 2007)

SLIDE 8

Examples of authentic assessments

  • Plans and models
  • Designs
  • Reports
  • Case history/interview; assessment; session notes; discharge or transfer summary …
  • Lab report
  • Literature searches and reviews
  • Care plans; session plan
  • Evidence‐based questions; critical appraisals
  • Applications
  • Funding; employment
  • Decision‐making activities
  • E.g. shortlisting candidates
  • Employment; waiting lists
  • Presentations
  • Case study; conference poster; platform presentation
  • Individual; group
  • Portfolios
  • More interesting to mark, too!
SLIDE 9

How do we facilitate lifelong learning skills and dispositions?

  • Self‐regulated learning
  • Active control and monitoring by students of aspects of their own learning
  • Important to develop
  • Metacognition
  • Thinking about the contents and processes of one’s mind (Winne & Azevedo, 2014)
  • Critical for self‐regulated learning
  • Reflection
  • Self‐regulation and critical reflection promote autonomy and self‐directed learning skills (Knight & Yorke, 2003; Tan, 2007)

  • Explicit reflection activities prompt consideration of a learning event
  • Such “mindfulness” facilitates the capacity to make informed judgements, valuable to future learning (Boud, 2007)

  • Learning is an iterative process
  • Requires extensive social participation and interaction
  • Sequential formative assessments
  • Engage students in an active, inclusive social space
  • Students learn the practices of their disciplines or professions
  • Facilitates learning from mistakes/errors
  • Mistakes are forms of feedback that “something is wrong”
  • Offer feedforward to students about why it is incorrect and how to improve and progress
  • Feedforward
  • Provided by peers and teachers, and students themselves
SLIDE 10

  • Integrative assessments provide students with opportunities to
  • Engage with meaningful tasks that have inherent worth
  • Define standards and expectations in their responses
  • Make judgements about their own learning or performance
  • Track and analyse their approaches to responding to a problem, issue, situation or performance

  • Integrate prior or current feedback into their response
  • Be rewarded for the quality of their analysis of their metacognitive abilities, rather than factual knowledge or specific performance (Crisp, 2012, p. 41)

What is the role of feedback?

  • Feedback is critical in the development of students’ self‐regulated learning

  • 1. Helps clarify good performance
  • 2. Facilitates development of reflection and self‐assessment
  • 3. Delivers high‐quality information to students
  • 4. Encourages dialogue around learning
  • 5. Encourages positive motivational beliefs and self‐esteem
  • 6. Provides opportunities to close the gap between current and desired performance

  • 7. Provides information to teachers

Nicol & Macfarlane‐Dick, 2006

SLIDE 11

  • Should be helpful, timely and developmental
  • Issues
  • Typically feedback is negatively viewed
  • Seen as rare; not recognised
  • Too much written feedback can be disheartening
  • If personal, feedback can be detrimental; damage students’ self‐esteem
  • May come too late and is not accessed – can’t have any remedial benefit
  • Making value judgements and providing high quality feedback is challenging
  • Sustainable feedback practices are required
  • Position feedback as an experience with the development of student self‐regulation at the core (Carless, 2013)

Should we use formative or summative assessment?

  • Formative assessment facilitates learning in a safe, supportive learning environment

  • Can be graded/marked
  • Learn from guidance on summative assessment
  • Discuss the tasks and requirements; provide exemplars
  • Give generic information before students complete the assignment
SLIDE 12

How can we sustain effective feedback practices?

  • View assessment and feedback as a relational practice
  • Dialogic, dynamic and open to negotiation
  • Explicitly build conversations about feedback into curriculum designs
  • Learner engagement in formative activity supports self‐regulation
  • Emphasise process as well as product
  • Encourage students to see themselves as active agents in their own learning

  • Support learners in using feedback to guide their learning
  • Future‐focussed, developmental comments
  • Comments need to be appropriately interpreted and acted upon
SLIDE 13

  • Strategies
  • Staged assignments
  • Podcasts or audio feedback
  • Permit speedier summative commenting procedures
  • Comment‐only marking
  • Withhold marks until students have engaged with feedback comments
  • Blend any criticism with praise and attend carefully to tone
SLIDE 14

What supporting frameworks are available?

  • Rubrics
  • Proforma
  • E.g. Reflection (DIEP; What? So what? Now what?)
  • Taxonomies
  • E.g. Bloom’s Revised (Anderson & Krathwohl, 2001); SOLO (Biggs & Collis, 1982); Levels of Reflection (Smith & Pope, 1990; Wong et al., 1995); Miller’s Triangle (1990) …

  • Curriculum mapping
  • Alverno model (Loacker; Mentkowski et al, 2016)
  • Programmatic assessment (van der Vleuten, 2005)

How can we sustain good assessment practice?

  • Formative assessment takes time
  • For academics and students
  • Workload formula
  • Takes time from research activities
  • Any more is “discretionary” practice
  • Students perceive too many assessments
  → Time‐poor academics may favour summative assessment over formative
  • Questionable timing, quality, consistency and quantity of feedback on summative assessment

  • Consider a programmatic approach (Jessop, El Hakim, & Gibbs, 2014; Mentkowski et al., 2016; van der Vleuten, 2005)

SLIDE 15

  • Well‐designed, carefully integrated and sequenced assessment
  • Encourages deep approaches to learning
  • Inspires students to devote high levels of effort and ‘time on task’
  • Consistently over the course of a programme (Gibbs, 2006)
  • Efficient, coordinated assessment across a programme
  → Better balance between formative and summative assessment

  • Develop more challenging tasks which integrate learning across modules
  • Incorporate cycles of feedback
  • Focus on development and feedforward across tasks and modules (Hounsell, McCune, Hounsell, & Litjens, 2008)

  • Holistic approaches to assessment for learning, with good assessment tasks

  • Central to the learning environment
  • Permeate the entire curriculum
  • Judicious and sparing use
  • Encourage students to develop as learners
  • Support learners to engage with disciplinary ways of thinking and practising

  • Focus on the ‘big picture’ and gradual development of expertise
SLIDE 16

What is programmatic assessment?

  • Fits with a constructivist view on education
  • Integrative, more holistic view on assessment
  • Assessment optimised differently for different purposes
  • Collectively, assessment is robust for stimulating good learning and valid decision‐making

  • Neutral in the choice of assessment method
  • Depends on educational justification
  • Implementation is challenging
  • Requires completely different orientation
  • Demanding of teachers and organisations
  • Different assessment culture
  • Learning oriented view
  • Individual assessment pieces provide limited information
  • Unsuitable for high stakes decisions
  • Provide feedback
  • Pass/fail decisions removed; focus on feedback
  • Quantitative or Qualitative, depending on the assessment method
  • More complex skills deserve more qualitative feedback
  • Offer potential for learning and rich information
  • Confident decisions only made with sufficient data points
  • Very high stakes decisions should be based on many points

Schuwirth & van der Vleuten, 2011

SLIDE 17

  • Map assessments across the program
  • Create connections, sequences, timing and logical flow
  • Balance summative and formative assessments

  → Meaningful and purposive learning

  • Program committee determines student progress
  • Mentoring
  • Learners supported in using feedback to guide learning
  • Mentors guide and coach the learner longitudinally
  • Promotes better learning
  • Play a role in defining remediation activities

How do we manage the threats?

  • Keep the faith!
  • Spread the word…
  • Develop and nurture a shared collegial culture
  • Regular collaborative team discussions
  • Generate and review assessment tasks, criteria etc.
  • Staff and student marking workshops
  • Collaborative online marking
  • Comments can be open and visible to enable sharing of good practice
  • Mentoring for new team members
  • Visit Alverno College
  • Biannual assessment workshops
  • https://www.alverno.edu/academics/resourcesforeducatorsresearchers/
SLIDE 18

Conclusion

  • An authentic assessment paradigm consistent with PBL philosophy is a lot of work – so, is it worth it?

  • Consider students’ future context and the anticipated demands
  • And how they are (mis)aligned with assessments
  • Appropriate assessment promises to meet the demands of the community in which students will be working

  • Students must be comfortable with real‐world complexities
  • University needs to prepare them to deal with everyday ambiguities, technologies and resources

  • It matters that students are assessed authentically so that the required knowledge, skills and dispositions can be practised now

SLIDE 19

References

  • Boud, D. & Associates (2010). Assessment 2020: Seven propositions for assessment reform in higher education. Retrieved from http://www.uts.edu.au/sites/default/files/Assessment‐2020_propositions_final.pdf
  • Kek, M. & Huijser, H. (2017). Assessing agile PBL. In Problem‐based learning into the future (pp. 79‐104). Springer Singapore.
  • Mentkowski, M., Diez, M. E., Lieberman, D., Mace, D. P., Rauschenberger, M., & Abromeit, J. (2016). Conceptual elements for performance assessment for faculty and student learning. In Assessing competence in professional performance across disciplines and professions (pp. 11‐38). Springer International Publishing.
  • Nicol, D. J., & Macfarlane‐Dick, D. (2006). Formative assessment and self‐regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199‐218.
  • Price, M., Carroll, J., O’Donovan, B., & Rust, C. (2011). If I was going there I wouldn’t start from here: A critical commentary on current assessment practice. Assessment & Evaluation in Higher Education, 36(4), 479‐492.
  • Sambell, K. (2016). Assessment and feedback in higher education: Considerable room for improvement? Student Engagement in Higher Education Journal, 1(1).
  • Schuwirth, L. W., & van der Vleuten, C. P. (2011). Programmatic assessment: From assessment of learning to assessment for learning. Medical Teacher, 33(6), 478‐485.
  • van der Vleuten, C. P. (2016). A programmatic approach to assessment. Medical Science Educator, 26(1), 9‐10.
  • van der Vleuten, C. P., Schuwirth, L. W. T., Driessen, E. W., Govaerts, M. J. B., & Heeneman, S. (2015). Twelve tips for programmatic assessment. Medical Teacher, 37(7), 641‐646.