

SLIDE 1

Formative Assessment System for Teachers:

Close the Gap with Something That Works

  • Dr. Sarah Brown

Iowa Department of Education

  • Dr. Theodore Christ

University of Minnesota & FastBridge Learning

  • Dr. Megan E. Cox

Minnesota Department of Education

SLIDE 2

Caveats & Disclosures

  • Potential Conflicts of Interest

Theodore J. Christ, PhD has equity and royalty interests in, and will serve on the Board of Directors for, FastBridge Learning (FBL) LLC, a company involved in the commercialization of the Formative Assessment System for Teachers (FAST). The University of Minnesota also has equity and royalty interests in FBL LLC. These interests have been reviewed and managed by the University of Minnesota in accordance with its Conflict of Interest policies.

  • Federal Funding

The research reported here was supported by the Institute of Education Sciences (R324B060014, R324A090038, R305A120086, R324A130161) and the Office of Special Education Programs (H327A060014-07, H327A090064, H327A100012, H327A110024, H327S150004), U.S. Department of Education, through grants to the University of Minnesota. The opinions expressed are those of the authors and do not represent views of the Institute, Office, or Department.

SLIDE 3

Summary (What was Promised)

Instructionally relevant assessment systems can enhance opportunity and equity, especially if they are efficient and provide timely information. The characteristics of such systems have eluded developers and users of formative assessment and evaluation. This may be related to misplaced paradigmatic loyalties (computer adaptive versus non-adaptive, multiple choice versus performance-based, psychometric versus behavioral/observational), which confuse and distract educators who just want something that works. This symposium will present findings and perspectives from state-level implementations of the Formative Assessment System for Teachers (FAST; FastBridge Learning) in Iowa and Minnesota, along with district-level implementations in other states.

SLIDE 4

Summary (What was Promised)

  • Introduction (10 min, Dr. Christ)
  • Minnesota Kindergarten Entry Program (15 min, Dr. Cox)
  • Iowa State-Wide Early Warning System (15 min, Dr. Brown)
  • Using Data & Closing (10 min, Christ)
  • Questions from audience (10 min, Drs. Brown, Cox, & Christ)
SLIDE 5

METHODS, PARADIGMS, EPISTEMOLOGY, AND PURPOSE

Data Collection, Interpretation, and Use

SLIDE 6

Paradigms as Method

Paradigms of Assessment

  • Psychometric Assessment

– Classical Test Theory
– Generalizability Theory
– Item Response Theory

  • Behavioral Assessment

– Curriculum-based measurement
– Curriculum-based assessment
– Direct behavior ratings
– Systematic direct observation

  • Diagnostic C & I Assessment

– Teacher made
– Curriculum embedded

Methods of Assessment

  • Modality

– Standardized, unstandardized
– Individual, group
– Adaptive, non-adaptive
– Selection, production
– Timed, untimed
– Computer, paper, in vivo

  • Scoring

– Standardized, unstandardized
– Qualitative, quantitative
– Dichotomous, polytomous
– Automated, person scored

  • Interpretation

– Norm-referenced
– Criterion-referenced (benchmark)
– Self-referenced (intra-individual)

SLIDE 7

Paradigms as Epistemology

Paradigms of Assessment

(Serafini, 2000)

  • Assessment as Measurement

– Positivistic (knowledge is factual)
– Large-scale, standardized, psychometric, selection, automated scoring, understand learner in sterile conditions

  • Assessment as Inquiry

– Constructivist (knowledge is subjective)
– Small-scale, unstandardized, individualized, diagnostic C&I, production, person scored, understand learner in context

  • Assessment as Procedure

– Positivistic (knowledge is factual)
– Elements of Measurement & Inquiry
– Small-to-large scale, semi-standardized, psychometric w/ authentic responses, production, person scored, emphasis on learner in context

Roles of Assessment Paradigms

(Nagy, 2000)

  • Gatekeeping
  • Accountability
  • Instructional Diagnosis
  • Maybe

– Gatekeeping & Accountability

SLIDE 8

Paradigms as Epistemology

Epistemology – study of “what do we know?”

– Realist – knowledge is objective fact
– Relativist – subjective experience counts as knowledge

Ontology – study of “how do we know?”

– Realist – knowledge derives from the discovery of objective facts
– Relativist – knowledge derives from subjective experience

(Diagram: Assessment as Inquiry – Assessment as Procedure – Assessment as Measurement)

SLIDE 9

Paradigms as Purpose

Purpose and Use of Data

  • Formative (Interim)

– Identify Problems
– Analyze Problems
– Plan Solutions
– Monitor Solutions

  • Summative (Interim)

– Evaluate Problems – Evaluate Solutions

Targets of Assessment

  • Systems

– Individual, group

  • Programs

– One, many, all

  • Educators

– Individual, group

  • Students

– Individual, group

SLIDE 10

Systems are at the Core

Problems are rarely specific to a student.


SLIDE 11

Systematic Problem Solving

  • Problem Solving

– Problem Identification
– Problem Analysis
– Plan Development
– Plan Implementation & Monitoring
– Plan Evaluation

  • Problem (P)

– difference between what is expected (E) and what occurs (O)

  • Problem Solver

– someone who acts to reduce or eliminate a problem
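The slide's definition of a problem, P = E − O, can be sketched in a few lines of code. This is a minimal illustration; the function and variable names are invented, not part of any FAST tool:

```python
# Minimal sketch of the slide's definition: a problem (P) is the
# difference between what is expected (E) and what occurs (O).
# Names here are illustrative, not from the presentation.

def problem_size(expected: float, observed: float) -> float:
    """Return P = E - O; a positive value means performance is below expectation."""
    return expected - observed

# Example: a class expected to read 100 words/minute reads 72.
gap = problem_size(expected=100, observed=72)  # gap == 28
```

A problem solver then acts to shrink this gap toward zero.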

SLIDE 12

Systematic Processes Solve Well-Identified Problems Beginning with Core Systems.


Systems are at the core of improvement

SLIDE 13

Systematic Problem Solving

Think about the Layers/Domains of Common Problems

SLIDE 14
SLIDE 15
SLIDE 16
SLIDE 17

Pillar of Systems Thinking

Learn to improve with a Systematic Process that iterates across Systems, Domains and Problems.

SLIDE 18

SLIDE 19

Paradigms

SLIDE 20

MINNESOTA

SLIDE 21

Minnesota’s Early Education Assessment Landscape 2002-2015

  • Statutory Requirements for Assessment

– Birth to Five

  • School Based Preschool Programs

– Kindergarten Entry

  • Kindergarten Readiness Assessment

– Reading Well By Third Grade

  • Yearly Reporting

Minnesota Department of Education - DO NOT COPY

SLIDE 22

National Perspective

  • Early childhood historically focused on environmental quality and process quality until system reform efforts redefined how we view children’s learning

– 50 states have early learning standards
– 26 states have a formal definition of school readiness
– 37 states measure children’s readiness at kindergarten entry

  • Federal initiatives provided funding for assessment design and revision:

– Race to the Top – Early Learning Challenge
– USDOE, Enhanced Assessment Grants
– USDOE, Preschool Development/Enhancement Grants

Center for Early Education Learning Outcomes, 2014
National Center for Education Statistics, 2013

SLIDE 23

Expectations for Young Children

1995

  • NAEYC Position Statement
  • Few states adopt school readiness definition

Today

  • States continue ecological perspective and include:

– School Readiness definitions
– Early Childhood Standards (5 domains)
– Lack of operational definitions

(Diagram: Community Ready – School Ready – Family Ready – Child Ready)

SLIDE 24

The Kindergarten Entry Assessment pilot

Two goals

– Refocus energy to classroom practice
– Focus on standards

  • Empirical equivalence to standards

– Using both early learning and K standards

  • Two broad phases

– Phase 1 – alignment
– Phase 2 – assessment equivalency

(Diagram: Nominate → Phase 1 → Phase 2)


SLIDE 25

Developmental Milestones

  • Constructed using the Early Childhood Indicators of Progress
  • Measures academic and non-academic skills
  • Criterion referenced, observational tool

SLIDE 26

Other tools tested

– Beginning Kindergarten Assessment
– Brigance Inventory of Early Development
– Desired Results Developmental Profile*
– Early Learning Scale
– Teaching Strategies Gold*
– Work Sampling System*
– Formative Assessment System for Teachers*


SLIDE 27

Method

Mixed method approach

  • Tool-to-standard crosswalks
  • Content validation process
  • Claim and evidence summaries
  • Technical adequacy
  • Empirical alignment to standards
  • Relative difficulty
  • Concurrent calibration
  • Standard setting

Sample

  • Phase 1: 1,694
  • Phase 2: 1,504
  • Mean age = 5.78 years
  • 78% white
  • 94% English speakers


SLIDE 28

Pilot Phases

Phase 1: Preliminary criteria – Crosswalk coverage – Evidentiary reports – Statistical alignment

Phase 2: Teacher use – Cadre scores – Equivalency analysis – Performance across groups – Scalability

Tools can be piloted and re-piloted


SLIDE 29

Conceptual Alignment

SLIDE 30

Phase 1 - Relative Difficulty

Example Item Maps – FAST DevMilestones Cognitive


SLIDE 31

Phase 1 - Relative Difficulty

Example Item Maps – FAST Language Literacy


SLIDE 32

Phase 2 – Concurrent Calibration

(Item map: DRDP, FAST, GOLD, and WSS items placed on a common scale for Language, Literacy, and Communication)
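Concurrent calibration fits one item response theory (IRT) model to responses from all tools at once, so items from DRDP, FAST, GOLD, and WSS land on a single difficulty scale, which is what makes item maps like this one comparable across instruments. As a toy sketch (illustrative only, not the pilot's estimation code; the function names are invented), the Rasch model underlying such maps is:

```python
import math

def p_correct(theta: float, b: float) -> float:
    """Rasch model: probability that a child of ability theta (logits)
    passes an item of difficulty b (logits)."""
    return 1.0 / (1.0 + math.exp(-(theta - b)))

# Once all items are jointly calibrated, difficulties are directly
# comparable across tools: a child of average ability (theta = 0) has a
# 50% chance on an item at b = 0 and a lower chance on a harder item.
p_easy = p_correct(0.0, 0.0)  # 0.5
p_hard = p_correct(0.0, 1.0)  # about 0.27
```

In practice the joint estimation runs in IRT software over the full response matrix; the point is that one latent scale makes "relative difficulty" comparisons across instruments meaningful.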

SLIDE 33

Phase 2 – Concurrent Calibration

(Item map: DRDP, FAST, GOLD, and WSS items placed on a common scale for Social Emotional)

SLIDE 34

Significance

  • Links measurement and learning standards
  • Fits within EAG, PDG and RTT-ELC federal initiatives
  • Allows districts and states choice of tool based on quality criteria
  • May provide legislative support for statewide expansion of KEAs


SLIDE 35
SLIDE 36

Iowa Context

  • State provided earlyReading, R-CBM, and aReading as part of MTSS implementation in 2013-2014.
  • Legislation changes led to offering the tools statewide.

SLIDE 37

Iowa Context

  • Early Literacy Implementation Law implemented in 15-16
  • Requires:

– K-3 Screening 3 times/year
– K-3 Progress monitoring weekly for all learners below benchmark targets
– Screening and PM using DE Approved Measures

SLIDE 38

Iowa Context

'13-'14: 10% Implementation
'14-'15: 94% Implementation
'15-'16: Over 95% Implementation

SLIDE 39

2015-2016 FAST Implementation

  • Screening: 2,767,775 administrations
  • Progress Monitoring: 2,043,829 administrations
  • CBM-R: 2,954,884 administrations
  • aReading: 345,323 administrations
SLIDE 40

EarlyReading Measures

(Chart: administrations by earlyReading measure – Letter Sounds, Nonsense Words, Sight Words, Word Segmenting, Onset Sounds, Letter Names, Concepts of Print, Sentence Reading, Decodable Real Words, Word Blending)

SLIDE 41

FastBridge Spanish

(Chart: administrations by FastBridge Spanish measure – CBM-R, Syllable Reading, Word Segmenting, Sight Words, Letter Sounds, Onset Sounds, Letter Names, Sentence Reading, Concepts of Print)

SLIDE 42

IGDIs Measures

(Chart: administrations by IGDIs measure – Picture Naming, Rhyming, Sound ID, WODNB, First Sounds)

SLIDE 43

State Connections: Special Education

Iowa’s State Personnel Development Grant

  • OSEP Grant: 5 years, 2015-2020
  • Purpose: to improve practitioner ability to diagnose, design, and deliver high-quality Specially Designed Instruction (SDI) for diverse learners so that learners with disabilities are successful.
  • Focus on Early Literacy
  • Using growth on FastBridge assessments as a grant evaluation measure.

SLIDE 44

State Connections: Accountability

(Diagram – Healthy Indicators: Assessment and Data-Based Decision-Making, Intervention System, Universal Instruction, Leadership, Infrastructure)

SLIDE 45

Assessment System

Healthy Indicator: Percent of learners screened with a valid and reliable universal screening tool.
Ideal Cut Scores: Intensive: 0-79%; Targeted: 80-94%; Universal: 95-100%
Data Source: Spring 2016 Screening

Healthy Indicator: Percent of learners not at benchmark assessed with a valid and reliable progress monitoring tool at least 90% of the weeks between screening periods.
Ideal Cut Scores: Intensive: 0-69%; Targeted: 70-89%; Universal: 90-100%
Data Source: Winter – Spring 2016 Progress Monitoring
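These cut scores amount to a simple tiering rule. Below is a minimal sketch using the screening indicator's cuts (Intensive 0-79%, Targeted 80-94%, Universal 95-100%); the function name and structure are illustrative, not from the Iowa materials:

```python
def screening_tier(percent_screened: float) -> str:
    """Map the percent of learners screened with a valid, reliable
    universal screener to a healthy-indicator tier (illustrative cuts)."""
    if percent_screened >= 95:
        return "Universal"
    if percent_screened >= 80:
        return "Targeted"
    return "Intensive"

# Examples:
# screening_tier(97) -> "Universal"
# screening_tier(85) -> "Targeted"
# screening_tier(60) -> "Intensive"
```

The progress monitoring indicator follows the same pattern with cuts at 70% and 90%.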

SLIDE 46
Assessment System

  • Purpose of administration
  • Testing vs. Teaching
  • Data Use

SLIDE 47

Universal Instruction

Healthy Indicator: Percent of learners at benchmark in a screening period.
Ideal Cut Scores: Intensive: 0-59%; Targeted: 60-79%; Universal: 80-100%
Data Source: Spring 2016 Screening

Healthy Indicator: Percent of learners at or above benchmark in the fall remaining at or above benchmark in a subsequent screening period.
Ideal Cut Scores: Intensive: 0-84%; Targeted: 85-94%; Universal: 95-100%
Data Source: Fall 2015 – Spring 2016 Screening

SLIDE 48

Universal Instruction

  • Priority area of focus
  • Classwide Intervention
SLIDE 49

Intervention System

Healthy Indicator: Percent of learners below benchmark two consecutive screening periods receiving intervention.
Ideal Cut Scores: Intensive: 0-79%; Targeted: 80-94%; Universal: 95-100%
Data Source: Fall 2016 Substantially Deficient designation and intervention scheduling

Healthy Indicator: Percent of learners below benchmark in the fall who then score at or above benchmark in a subsequent screening period.
Ideal Cut Scores: Intensive: 0-49%; Targeted: 50-64%; Universal: 65-100%
Data Source: Fall 2015 – Spring 2016 Screening

SLIDE 50

Challenges and Next Steps

Challenges

  • Practitioner understanding of the purpose of screening and progress monitoring.
  • Early childhood integration.
  • Relationship to proficiency measure currently used.

Next Steps

  • Continued training.
  • Support for utilizing progress monitoring data to make instructional decisions.

SLIDE 51

PURPOSE AS PARADIGM INTERPRETATION AND USE OF DATA

Methods, paradigms, epistemology, and purpose

SLIDE 52

SLIDE 53

Phase 2 – Concurrent Calibration

(Item map: DRDP, FAST, GOLD, and WSS items placed on a common scale for Language, Literacy, and Communication)

SLIDE 54

Phase 2 – Concurrent Calibration

(Item map: DRDP, FAST, GOLD, and WSS items placed on a common scale for Social Emotional)

SLIDE 55

Purpose as Paradigm

Data Collection, Interpretation, and Use

SLIDE 56

Purpose as Paradigm

Purpose and Use of Data

  • Formative (Interim)

– Identify Problems
– Analyze Problems
– Plan Solutions
– Monitor Solutions

  • Summative (Interim)

– Evaluate Problems – Evaluate Solutions

Targets of Assessment

  • Systems

– Individual, group

  • Programs

– One, many, all

  • Educators

– Individual, group

  • Students

– Individual, group

SLIDE 57

Purpose as Paradigm

SLIDE 58

Easy Button (Example)

  • There is so much, what should I use?

– FAST Reading
  – aReading (branches to…)
  – AUTOreading
  – CBMreading (or earlyReading)

– FAST Math
  – aMath (branches to…)
  – CBMmath Automaticity (or earlyMath)
  – CBMmath Process (or earlyMath)

– FAST SEL (social-emotional learning)

SLIDE 59

MANY NEEDS, MANY PURPOSES, MANY PARADIGMS… PARADIGM AS PURPOSE. NOTHING LESS WILL DO.

Multi-Purpose Multi-Method Multi-Source Multi-Paradigm
