Helping Districts Design Balanced Assessment Systems – PowerPoint PPT Presentation


SLIDE 1

Helping Districts Design Balanced Assessment Systems

National Conference on Student Assessment – CCSSO June 27-29, 2018 San Diego, CA.

SLIDE 2

Session Outcomes

  • Share a new resource – the District Assessment System Design Toolkit
  • Share process and observations about user experience with the Toolkit, based on initial Proof of Concept Pilot

SLIDE 3

District Assessment System Design Toolkit Pilot

Kathy Dewsbury-White | President & CEO
Ed Roeber | Assessment Director
Ellen Vorenkamp | Assessment Consultant
Stephen Snead | Supervisor, Curriculum & Assessment Unit
Jonathan Flukes | Research, Evaluation, & Assessment Consultant
Joseph Martineau | Senior Associate

Bloomfield Hills Schools | Dearborn Schools | Novi Community School District

SLIDE 4

Why Develop the Toolkit? Why Conduct a Proof of Concept Pilot?

SLIDE 5

Top 3 Reasons Why We Should Invest in Development of Assessment Literacy

  • Assessment used only as an accountability tool will never result in improvement. In other words, assessment used only to certify status is not an effective school improvement strategy.
  • Using assessment well is essential to meeting the NEW mission of public education, i.e., assisting ALL students to meet high standards.
  • Cognitive science tells us we can only be effective with our craft if we use assessment well, i.e., execute effective assessment practice and balanced, comprehensive assessment systems. In other words, effective teaching and learning practice depends on effective/integrated assessment practice.

SLIDE 6

Resource Response – National Perspective

SLIDE 7

Motivating Questions

  • Do you have a coherent and effective district and school assessment system that complements and enhances instruction?
  • Does it serve student, teacher, administrator, and policymaker needs, and with the correct balance?
  • Was the system designed so that teachers, principals, and administrators share power in the service of providing students the best possible education? Or does the collection of tests feel more like a collection of bricks than a well-designed house?*

* Coladarci, T. (2002). Is it a House…Or a Pile of Bricks? Important Features of a Local Assessment System. The Phi Delta Kappan, 83(10), pp. 772-774.

SLIDE 8

Some Symptoms of Having a Pile of Bricks

  • Purposes of Assessment
    • Unclear, misunderstood, understood differently by different people.
    • Given by tradition, but purpose has been forgotten.
    • “System creep,” with new tests added but old ones rarely dropped.
  • Appropriate Use
    • Type of assessment may not be appropriate for the intended purpose.
    • Test data are rarely used (if at all) when data become available.
    • New tests added without explicit attention to appropriate use.
    • It feels like some tests will crumble under the weight of use.
  • Coherence and Fit with Instruction
    • State, district, and/or classroom tests conflict in timing, content standards, and/or results.
    • It feels like, overall, testing disrupts rather than facilitates instruction.
SLIDE 9

Challenges in Creating a House Rather than a Pile of Bricks Addressed in the Toolkit and Associated Facilitated Process

  • Achieving a shared assessment vocabulary
  • Achieving a shared understanding of intended purposes of assessment
  • Achieving a shared understanding of the match between purpose and types/characteristics of assessment
  • Designing a parsimonious system that minimizes duplication (and thus the amount of potential instructional time devoted to formal test taking)

SLIDE 10

Challenges in Creating a House Rather than a Pile of Bricks Addressed in the Toolkit and Associated Facilitated Process

  • Designing a coherent system in which components are consistent with an overall vision and complementary to each other in drawing a balanced and non-contradictory picture of student learning
  • Developing a productive approach to power sharing across levels of responsibility (e.g., teacher, principal, central office, superintendent, school board)
  • Developing a sound plan to implement the newly designed system that attends to potential barriers
  • Developing a sound plan to maintain the system over time so that it evolves as a system to meet changing needs.

SLIDE 11

Affirming National Need at State & Local Level

  • Changes in the state’s education landscape have placed much emphasis on state accountability assessments.
  • Vendor-created assessment solutions offer lots of conveniences…in isolation from one another.
  • Frequent changes in multiple levels of local LEA leadership have led to a lack of historical institutional knowledge.
  • Difficult for regional service agencies to support LEAs who don’t have a clear picture of their assessment “system”
  • The number of positions solely dedicated to assessment is growing.
  • Momentum is building to think critically about the role of assessment in K-12 education.
  • Interest in professional learning on assessment is on the rise.
  • Emphasis on state & national assessments has helped put a spotlight on classroom assessment

SLIDE 12

Proof of Concept Pilot

A proof of concept (POC) or a proof of principle is a realization of a certain method or idea to demonstrate its feasibility, or a demonstration in principle, whose purpose is to verify that some concept or theory has the potential of being used effectively… Two aims…

  • Refine the District Assessment System Design Toolkit
  • Determine what type of facilitated support might be necessary or desirable to assist a district using the Toolkit

1. Kathy to Joseph… “Do you want to try it out with some real districts?”
2. Joseph to Kathy… “Yeah, that’s a good idea. We would do a POC Pilot.”

SLIDE 13

Description of the Toolkit & the Pilot

SLIDE 14

What is the District Assessment System Design Toolkit – materials include…

Tools for…

  • Performing an assessment audit
  • Summarizing the results of the assessment audit
  • Implementing a principled system design process
  • Comparing the results of the audit and the design process

Templates for…

  • Feedback from participants
  • Compiling potential barriers to implementation
  • Developing a project plan for implementation [including strategies to address potential barriers]
  • Creating a design document
  • Developing a project implementation plan
  • Developing a comprehensive report on the results of the process.
SLIDE 15

Included in the Toolkit and Associated Facilitated Process

  • Professional learning regarding…
    • Types and characteristics of assessment
    • Purposes of assessment
    • Matching types and characteristics to purpose
    • Integration of all educator/leader roles within the district
  • Principled processes to…
    • Audit an existing district assessment system
    • Select and prioritize intended purposes of the district assessment system
    • Design/refine a district assessment system to optimally match intended purposes
    • Develop a plan to address potential barriers to implementation
  • A comprehensive end-of-process report that…
    • Describes the principled process
    • Incorporates a design document based on district team decisions
    • Incorporates a set of principles for allowing the system to evolve to meet changing needs, but that builds on the process already completed.

SLIDE 16

Where the Toolkit Stops

The toolkit and associated facilitated process stops with a design that identifies prioritized intended purposes, the types and characteristics of the assessments that will be used to meet them, and a listing of potential barriers to implementation with associated potential solutions.

SLIDE 17

Not Included in the Toolkit and Associated Facilitated Process

  • Identification of specific assessments to be used
  • Identification of specific professional learning
  • A specific implementation plan
  • A specific maintenance plan
  • Additional professional learning will need to be targeted for different types of educators within the district. The MAC’s assessment learning network is a great place for targeted professional learning opportunities.
SLIDE 18

District Assessment System Design Toolkit Pilot

Kathy Dewsbury-White | President & CEO
Ed Roeber | Assessment Director
Ellen Vorenkamp | Assessment Consultant
Stephen Snead | Supervisor, Curriculum & Assessment Unit
Jonathan Flukes | Research, Evaluation, & Assessment Consultant
Joseph Martineau | Senior Associate

Bloomfield Hills Schools | Dearborn Schools | Novi Community School District

SLIDE 19

Process to Pilot Use of the DASD

  • Three approximately 4-hour workshops
  • LEA teams supported by their ESEA consultant and a project wiki
  • Sending a team representing teachers, building leadership, central office staff, and district leadership
  • Advance reading materials to minimize the number and duration of meetings.
  • Homework between sessions.
  • From populated Toolkit, with customized responses – district develops plan of action

SLIDE 20

What a District Gets From Using the DASD Toolkit

A comprehensive report detailing…

  • The deliberative process
  • Representation on the district team
  • An assessment system design document and associated schematic representation
  • A compilation of potential barriers to implementation with associated strategies for addressing them
  • A set of principled questions to address ongoing maintenance of a system

SLIDE 21

Who Did What…

Michigan Assessment Consortium

  • Developed agreements & understandings
  • Directed project management team
  • Funded meeting expenses (non-personnel)
  • Provided evaluation service
  • Herded the cats (communication maintained, expectations nudged)

Regional Educational Service Agency Consultants

  • Enlisted district participation with identification of district lead/contact.
  • Facilitated support for LEA team during workshop meetings
  • Facilitated support in between workshop meetings
  • Offered some topical content presentation (as needed) and facilitated support to ensure interaction and increase understanding/engagement

Center for Assessment

  • Developed and refined Toolkit
  • Developed agendas for workshops to support conceptual framework for the pilot
  • Presented content during workshops
  • Responded to district homework

SLIDE 22

Roles Anticipated by the Toolkit

  • Facilitation Team
    • Project Champion (the Superintendent or her designee)
    • External Facilitator (an expert in assessment with knowledge of local, state, and national issues)
    • District Liaison (the district’s point person on the project)
    • Meeting Coordinator (the project logistics coordinator)
  • Design Team
    • Representatives from levels of responsibility (district, school, classroom)
    • Representatives from levels of schools (elementary, middle, high)
    • Representatives from content areas and specialty areas
SLIDE 23

Toolkit Structure

SLIDE 24

Toolkit Structure

  • To introduce a potentially interested district liaison to the toolkit and associated process
  • To introduce a potential district champion to the toolkit and associated process

SLIDE 25

Toolkit Structure

  • User Guide for the External Facilitator and District Liaison

SLIDE 26

Toolkit Structure

SLIDE 27

Toolkit Structure

  • A project report template

SLIDE 28

Toolkit Structure

  • A locked, completed report from a hypothetical district

SLIDE 29

Toolkit Structure

  • An unlocked version of the exemplar to copy from or use as a base

SLIDE 30

Toolkit Structure

SLIDE 31

Toolkit Structure

  • The workhorse of the Toolkit – a blank project workbook

SLIDE 32

Toolkit Structure

  • Locked exemplar completed workbook
  • Unlocked version of the exemplar to tinker with to see how the workbook works

SLIDE 33

Toolkit Structure

SLIDE 34

Toolkit Structure

  • Workshop materials for each workshop. Includes materials for the facilitation team and design team

SLIDE 35

Toolkit Structure

  • The results of the process

SLIDE 36

Workshop #1

  • Build a common assessment vocabulary
    • Types of assessment
    • Units of curriculum and instruction
    • Purposes of assessment
  • Introduce the district assessment audit worksheet in the project workbook
  • Start conducting the audit
  • Homework – finishing the audit
SLIDE 37

Workshop #2

  • Review formative assessment in depth
  • Select the intended purposes of the assessment system
SLIDE 38

Select purposes, which may differ by school level.

SLIDE 39

Workshop #2

  • Review formative assessment in depth
  • Select the intended purposes of the assessment system
  • Prioritize the selected purposes
SLIDE 40

All selected purposes are important because they were selected. Now the task is to prioritize them within the set as critical, high, mid, or low priority.

slide-41
SLIDE 41

The workbook identifies the match of each purpose to the various types and characteristics of assessment: 4 = ideal match; 3 = strong match; 2 = moderate match; 1 = weak match; blank = no match.

SLIDE 42

Workshop #2

  • Review formative assessment in depth
  • Select the intended purposes of the assessment system
  • Prioritize the selected purposes
  • Identify the type(s) and characteristics of assessment to be used to fulfill each purpose:
    • Starting with the highest priority and working down
    • Best matches are the most important for the highest-priority purposes
    • A system can be made more parsimonious by using, for a lower-priority purpose, an assessment also used for a higher-priority purpose, as long as the degree of match is not too low.
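The prioritize-then-match routine described above can be sketched as a small greedy routine. This is a hypothetical illustration only; the Toolkit is a facilitated workbook process, not software, and the purpose names, assessment types, match scores, and reuse threshold below are all invented:

```python
# Hypothetical sketch of the Workshop #2 selection step: walk purposes from
# highest to lowest priority, reuse an already-chosen assessment type for a
# lower-priority purpose when its match score is acceptable, and otherwise
# add the best-matching new type. All names and scores are invented.

PRIORITY_ORDER = ["critical", "high", "mid", "low"]
REUSE_THRESHOLD = 3  # reuse only when the match is at least "strong"

def design_system(purposes, match):
    """purposes: list of (purpose, priority); match: {purpose: {type: score}}."""
    ordered = sorted(purposes, key=lambda p: PRIORITY_ORDER.index(p[1]))
    chosen_types = []  # assessment types already in the system
    design = {}        # purpose -> assessment type serving it
    for purpose, _priority in ordered:
        scores = match[purpose]
        # Prefer reusing an existing type if the degree of match is not too low.
        reusable = [t for t in chosen_types if scores.get(t, 0) >= REUSE_THRESHOLD]
        if reusable:
            design[purpose] = max(reusable, key=lambda t: scores[t])
        else:
            best = max(scores, key=scores.get)  # best available match (4 = ideal)
            chosen_types.append(best)
            design[purpose] = best
    return design

print(design_system(
    [("grading", "critical"), ("program evaluation", "mid")],
    {"grading": {"course finals": 4, "interim": 2},
     "program evaluation": {"course finals": 3, "interim": 4}},
))
```

In this invented example the lower-priority purpose reuses "course finals" because its match (3) clears the threshold, keeping the system parsimonious rather than adding a second assessment.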

SLIDE 43

An “X” identifies the types and characteristics of assessments used for each purpose, guided by the degree of match between purpose and type/characteristic.

SLIDE 44

Workshop #2

  • Review formative assessment in depth
  • Select the intended purposes of the assessment system
  • Prioritize the selected purposes
  • Identify the type(s) and characteristics of assessment to be used to fulfill each purpose

  • Review and make any necessary edits to the design matrix
SLIDE 45

Workshop #2

  • Review formative assessment in depth
  • Select the intended purposes of the assessment system
  • Prioritize the selected purposes
  • Identify the type(s) and characteristics of assessment to be used to fulfill each purpose

  • Review and make any necessary edits to the design matrix
  • Review the results of the audit
SLIDE 46

The number in each cell indicates the number of entries in the audit that fit that cell (i.e., the purpose of the assessment and the type of assessment or its characteristics).
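The tallying behind this summary matrix can be illustrated with a short sketch. The entry names, purposes, and types below are invented; the real audit lives in the project workbook:

```python
# Hypothetical sketch of the audit summary: count how many audit entries fall
# into each (purpose, assessment type) cell of the matrix. Names are invented.
from collections import Counter

audit_entries = [
    {"name": "benchmark reading screener", "purpose": "screening", "type": "interim"},
    {"name": "unit quizzes", "purpose": "grading", "type": "classroom summative"},
    {"name": "course finals", "purpose": "grading", "type": "classroom summative"},
]

cell_counts = Counter((e["purpose"], e["type"]) for e in audit_entries)
print(cell_counts[("grading", "classroom summative")])  # → 2
```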

SLIDE 47

SLIDE 48

Workshop #2

  • Review formative assessment in depth
  • Select the intended purposes of the assessment system
  • Prioritize the selected purposes
  • Identify the type(s) and characteristics of assessment to be used to fulfill each purpose
  • Review and make any necessary edits to the design matrix
  • Review the results of the audit and a comparison to the new design matrix

SLIDE 49

SLIDE 50

Workshop #2

  • Review formative assessment in depth
  • Select the intended purposes of the assessment system
  • Prioritize the selected purposes
  • Identify the type(s) and characteristics of assessment to be used to fulfill each purpose

  • Review and make any necessary edits to the design matrix
  • Review the results of the audit
  • Begin identifying potential barriers to implementation
  • Homework is to complete the identification
SLIDE 51

Workshop #3

  • Review a set of design assertions developed by the external facilitator
SLIDE 52

Workshop #3

  • Review a set of design assertions developed by the external facilitator
  • Review a system design schematic developed by the external facilitator (built up step by step with explanations)

SLIDE 53

What Does the System Look Like?

[System design schematic: locus of control spans state & district policymakers, district, teacher, and student. Components include curriculum standards, learning theory, pre- and in-service professional learning, daily lesson planning, minor- and major-unit planning, daily instruction, daily formative assessment, minor-unit and major-unit modular interims, summative course finals, and the state summative assessment. Example annotations: 3 mid-term exams or papers (developed by district, timed by teacher); graded homework and quizzes; final exam, project, paper, or performance (developed by district, adopted/adapted by teacher).]

SLIDE 54

Workshop #3

  • Review a set of design assertions developed by the external facilitator
  • Review a system design schematic developed by the external facilitator (built up step by step with explanations)

  • Review the potential barriers, and develop potential solutions
SLIDE 55

Workshop #3

  • Review a set of design assertions developed by the external facilitator
  • Review a system design schematic developed by the external facilitator (built up step by step with explanations)

  • Review the potential barriers, and develop potential solutions
  • Begin developing a high-level plan for implementation
SLIDE 56

LEA Pilot Participant Perspectives

Ellen Vorenkamp, Wayne RESA, MI Assessment Consultant Steven Snead, Oakland Schools, MI – Director Curriculum & Assessment

SLIDE 57

Dearborn Public Schools – Demographics

38 Schools serving just under 21,000 Students

  • 3 Comprehensive High Schools
  • 3 Early College/Magnet High Schools
  • 7 Specialized High Schools (CTE, Alternative, etc.)
  • 7 Middle Schools
  • 19 Elementary Schools

Student Demographics

  • Ethnicity
    • 3% African American
    • 2% Hispanic
    • 94% White (Arabic)
  • 47% ELL
  • 7% Special Needs
  • 76% Economically Disadvantaged
SLIDE 58

Motivating Dearborn Public Schools Participation in the Pilot

Recently rewrote their strategic plan… ”Dearborn PSD will create a well-aligned assessment system that provides important data on teaching and learning with as little impact on instructional time as possible. Data from these assessments will be used during PLCs to make instructional decisions.” By well-aligned we mean…

  • That the assessments used throughout the year fit together in a logical way to support our instructional beliefs about what is important for students to learn.
  • That we balance assessing fundamental skills as well as more challenging competencies such as critical thinking and problem solving.
  • We also need to ensure that we have assessments to meet different purposes, ranging from informing student learning to program evaluation.

SLIDE 59

Motivating Dearborn Public Schools Participation in the Pilot

  • We decided to engage in this pilot in the hopes that…
    • This project will help us increase our assessment literacy and provide a framework to address the creation of a balanced assessment system
    • And, that our work with the toolkit will help us look at the quality of our assessments, knowing that true alignment has many different dimensions, in a more critical manner.

SLIDE 60

Outcomes for Dearborn Public Schools Participation in the Pilot

  • 21 Participants attended the Pilot sessions this past winter

“Having open, deliberate conversations about the processes taking place as a district for student assessment.” “This framework on assessment types and how they relate to standards. Collaboration time with team was hopeful.”

  • Major Takeaways…
    • Assessment Literacy is critical
    • Shared understanding of Formative Assessment Practices is a needed focus
SLIDE 61

Bloomfield Hills Schools: Demographics, Motivation to Participate and Outcomes

  • 5,647 Students
  • 11.86% ED
  • 72.52% White
  • 6.55% EL
  • 13.6% SE
  • $12,124 PP Funding
  • $169,265 Median Household Income
  • $794,600 Median property value
  • 89.8% Homeownership rate

“Your child will experience the special freedom of having the greatest number of top-quality opportunities to discover his or her potential.” – BHS Superintendent Dr. Rob Glass

Over the past decade, BHS has become more racially, economically, and linguistically diverse.

  • BHS is making intentional moves from a district with islands of excellence to a district with systems of excellence.
  • BHS completely reorganized its central office staff; new academic, assessment, and content area leads.
  • Superintendent, Asst. Superintendent, Director of Learning Services, and Assessment Coordinator all attended.
  • The pilot helped them facilitate a district-wide conversation about the purpose of their assessments.
  • Facilitation by a 3rd party was key, but eventually transitioned to internal facilitation.
  • As a result of the process, they have decided to place a strong emphasis on developing teacher use of Formative Assessment and locally developed Common Assessments.

SLIDE 62

Evaluation of the Pilot

SLIDE 63

Pilot Evaluation Method

Includes:

  1. Review Process
     • Documents
     • Observation
  2. End of Session Surveys
  3. End of Pilot Surveys
  4. Review of District Toolkits & Action Plans
  5. Interviews
     • ESEA Consultants
     • LEA Leads
     • Pilot Project Management Team
SLIDE 64

Where do we go from here…discussion

SLIDE 65

Discussion Questions we have about the toolkit and use of the toolkit

  • What questions do you still have about the Toolkit and the process to use it? (session participants)
  • What are the strengths and challenges implicit in the Toolkit and its use? (EV, SS, ER)
  • What did we learn from the pilot (highlights impacting future work/direction)? (JM, EV, SS, ER)
  • Do we anticipate needing to conduct another pilot? (ER, JM)
SLIDE 66

Thank You

The Executive Summary and PPT of the session are uploaded to the NCSA 2018 conference site.