

  1. Formative Assessment System for Teachers: Close the Gap with Something That Works. Dr. Sarah Brown, Iowa Department of Education; Dr. Theodore Christ, University of Minnesota & FastBridge Learning; Dr. Megan E. Cox, Minnesota Department of Education

  2. Caveats & Disclosures
      • Potential Conflicts of Interest: Theodore J. Christ, PhD has equity and royalty interests in, and will serve on the Board of Directors for, FastBridge Learning (FBL) LLC, a company involved in the commercialization of the Formative Assessment System for Teachers (FAST). The University of Minnesota also has equity and royalty interests in FBL LLC. These interests have been reviewed and managed by the University of Minnesota in accordance with its Conflict of Interest policies.
      • Federal Funding: The research reported here was supported by the Institute of Education Sciences (R324B060014, R324A090038, R305A120086, R324A130161) and the Office of Special Education Programs (H327A060014-07, H327A090064, H327A100012, H327A110024, H327S150004), U.S. Department of Education, through grants to the University of Minnesota. The opinions expressed are those of the authors and do not represent views of the Institute, Office, or Department.

  3. Summary (What was Promised) Instructionally relevant assessment systems can enhance opportunity and equity, especially if they are efficient and provide timely information. The characteristics of such systems have eluded developers and users of formative assessment and evaluation. This may be related to misplaced paradigmatic loyalties (computer adaptive versus non-adaptive, multiple choice versus performance-based, psychometric versus behavioral/observational), which confuse and distract educators who just want something that works. This symposium will present findings and perspectives from state-level implementations of the Formative Assessment System for Teachers (FAST; FastBridge Learning) in Iowa and Minnesota, along with district-level implementations in other states.

  4. Summary (What was Promised) • Introduction (10 min, Dr. Christ) • Minnesota Kindergarten Entry Program (15 min, Dr. Cox) • Iowa State-Wide Early Warning System (15 min, Dr. Brown) • Using Data & Closing (10 min, Christ) • Questions from audience (10 min, Drs. Brown, Cox, & Christ)

  5. Data Collection, Interpretation, and Use METHODS, PARADIGMS, EPISTEMOLOGY, AND PURPOSE

  6. Paradigms as Method
      Paradigms of Assessment:
      • Psychometric Assessment – Classical Test Theory, Generalizability Theory, Item Response Theory
      • Behavioral Assessment – Curriculum-based measurement, Curriculum-based assessment, Direct behavior ratings, Systematic direct observation
      • Diagnostic C & I Assessment – Teacher made, Curriculum embedded
      Methods of Assessment:
      • Modality – Standardized/unstandardized; Individual/group; Adaptive/non-adaptive; Selection/production; Timed/untimed; Computer/paper/in vivo
      • Scoring – Standardized/unstandardized; Qualitative/quantitative; Dichotomous/polytomous; Automated/person scored
      • Interpretation – Norm-referenced; Criterion-referenced (benchmark); Self-referenced (intra-individual)
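The three entries under Interpretation differ only in the reference point that gives a raw score meaning. A minimal sketch in Python, using invented scores, an invented benchmark, and an invented prior score (none of these values come from FAST or the slides):

```python
# Minimal sketch of the three interpretation frames from the slide; the scores,
# benchmark, and prior score below are invented for illustration.
group_scores = [12, 15, 18, 21, 22, 25, 27, 30, 33, 36]   # hypothetical peer group
student_score = 25
benchmark = 28            # hypothetical criterion (benchmark) cut score
previous_score = 19       # the same student's earlier score

# Norm-referenced: position relative to a comparison group (percentile rank).
percentile = 100 * sum(s <= student_score for s in group_scores) / len(group_scores)

# Criterion-referenced: comparison to a fixed benchmark, regardless of peers.
meets_benchmark = student_score >= benchmark

# Self-referenced (intra-individual): growth relative to the student's own history.
growth = student_score - previous_score

print(f"percentile rank: {percentile:.0f}")
print(f"meets benchmark: {meets_benchmark}")
print(f"growth since previous assessment: {growth}")
```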

  7. Paradigm as Epistemology
      Paradigms of Assessment (Serafini, 2000):
      • Assessment as Measurement – Positivistic (knowledge is factual); large-scale, standardized, psychometric, selection, automated scoring, understand the learner in sterile conditions
      • Assessment as Inquiry – Constructivist (knowledge is subjective); small-scale, unstandardized, individualized, diagnostic C&I, production, person scored, understand the learner in context
      • Assessment as Procedure – Positivistic (knowledge is factual); elements of Measurement & Inquiry; small-to-large scale, semi-standardized, psychometric with authentic responses, production, person scored, emphasis on the learner in context
      Roles of Assessment Paradigms (Nagy, 2000):
      • Assessment as Measurement – Gatekeeping; Accountability
      • Assessment as Inquiry – Instructional Diagnosis
      • Assessment as Procedure – Instructional Diagnosis; maybe Gatekeeping & Accountability

  8. Paradigms as Epistemology (Assessment as Measurement, Assessment as Inquiry, Assessment as Procedure)
      • Epistemology – study of "what do we know?": Realist – knowledge is objective fact; Relativist – subjective experience counts as knowledge
      • Ontology – study of "how do we know?": Realist – knowledge derives from the discovery of objective facts; Relativism – knowledge derives from subjective experience

  9. Paradigms as Purpose
      Purpose and Use of Data:
      • Formative (Interim) – Identify Problems, Analyze Problems, Plan Solutions, Monitoring Solutions
      • Summative (Interim) – Evaluate Problems, Evaluate Solutions
      Targets of Assessment:
      • Systems – individual, group
      • Programs – one, many, all
      • Educators – individual, group
      • Students – individual, group

  10. Systems are at the Core. Problems are rarely specific to a student.

  11. Systematic Problem Solving
      • Problem Solving – Problem Identification, Problem Analysis, Plan Development, Plan Implementation & Monitoring, Plan Evaluation
      • Problem (P) – the difference between what is expected (E) and what occurs (O)
      • Problem Solver – someone who acts to reduce or eliminate a problem
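The slide's definition can be written as P = E − O: the problem is the gap between the expected and the observed level of performance. A tiny worked example with invented numbers (the benchmark and observed rate below are hypothetical, not FAST values):

```python
# Problem (P) = Expected (E) - Observed (O), per the slide's definition.
# The benchmark and observed score are hypothetical, not FAST values.
expected = 40    # e.g., a winter benchmark of 40 words read correctly per minute
observed = 28    # the student's (or group's) current performance
problem = expected - observed
print(f"Gap to close: {problem} words correct per minute")  # 12
```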

  12. Systematic Processes Solve Well-Identified Problems, Beginning with Core Systems. Systems are at the core of improvement.

  13. Systematic Problem Solving of Common Problems. Think about the Layers/Domains.

  14. Pillar of Systems Thinking: Learn to improve with a Systematic Process that iterates across Systems, Domains, and Problems.

  15. & DBR

  16. Paradigms

  17. MINNESOTA

  18. Minnesota's Early Education Assessment Landscape 2002-2015
      Statutory Requirements for Assessment:
      • Birth to Five – School Based Preschool Programs
      • Kindergarten Entry – Kindergarten Readiness Assessment
      • Reading Well By Third Grade – Yearly Reporting

  19. National Perspective
      • Early childhood historically focused on environmental quality and process quality until system reform efforts redefined how we view children's learning
        – 50 states have early learning standards
        – 26 states have a formal definition of school readiness
        – 37 states measure children's readiness at kindergarten entry
      • Federal initiatives provided funding for assessment design and revision: Race to the Top – Early Learning Challenge; USDOE, Enhanced Assessment Grants; USDOE, Preschool Development/Enhancement Grants
      (Center for Early Education Learning Outcomes, 2014; National Center for Education Statistics, 2013)

  20. Expectations for Young Children
      1995:
      • NAEYC Position Statement – Community Ready, School Ready, Family Ready, Child Ready
      • Few states adopt a school readiness definition
      Today:
      • States continue the ecological perspective and include School Readiness definitions and Early Childhood Standards (5 domains)
      • Lack of operational definitions

  21. The Kindergarten Entry Assessment Pilot
      Two goals:
      • Refocus energy to classroom practice
      • Focus on standards – empirical equivalence to standards, using both early learning and K standards
      Two broad phases (Nominate → Phase 1 → Phase 2):
      • Phase 1 – alignment
      • Phase 2 – assessment equivalency

  22. Developmental Milestones
      • Constructed using the Early Childhood Indicators of Progress
      • Measures academic and non-academic skills
      • Criterion-referenced, observational tool

  23. Other tools tested:
      • Beginning Kindergarten Assessment
      • Brigance Inventory of Early Development
      • Desired Results Developmental Profile*
      • Early Learning Scale
      • Teaching Strategies Gold*
      • Work Sampling System*
      • Formative Assessment System for Teachers*

  24. Method
      Mixed method approach:
      • Tool-to-standard crosswalks
      • Content validation process
      • Claim and evidence summaries
      • Technical adequacy
      • Empirical alignment to standards
      • Relative difficulty
      • Concurrent calibration
      • Standard setting
      Sample:
      • Phase 1: 1,694
      • Phase 2: 1,504
      • Mean age = 5.78 years
      • 78% white
      • 94% English speakers
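Slide 24 lists relative difficulty among the analyses, and slides 27-28 show the resulting item maps. A simplified sketch of the underlying idea, using a made-up set of skills and made-up proportions of children demonstrating each one (the pilot's actual models and data are not reproduced here): converting each proportion to a logit and sorting yields an item-map-style ordering from easiest to hardest.

```python
import math

# Hypothetical percent-demonstrating values; the real pilot data are not shown here.
skills = {
    "Names some letters": 0.88,
    "Counts 10 objects": 0.74,
    "Rhymes words": 0.55,
    "Blends sounds into words": 0.31,
}

def relative_difficulty(p_correct: float) -> float:
    """Logit of the failure rate: higher values = harder skills (a rough
    classical stand-in for an IRT difficulty estimate)."""
    return math.log((1 - p_correct) / p_correct)

item_map = sorted((relative_difficulty(p), skill) for skill, p in skills.items())
for difficulty, skill in item_map:
    print(f"{difficulty:+.2f}  {skill}")
```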

  25. Pilot Phases (preliminary criteria; tools can be piloted and re-piloted)
      • Phase 1 – Statistical alignment, Crosswalk coverage, Teacher use, Cadre scores, Evidentiary reports
      • Phase 2 – Equivalency analysis, Scalability, Performance across groups

  26. Conceptual Alignment

  27. Phase 1 – Relative Difficulty: Example Item Maps – FAST DevMilestones, Cognitive

  28. Phase 1 – Relative Difficulty: Example Item Maps – FAST Language Literacy

  29. Phase 2 – Concurrent Calibration: Language, Literacy, and Communication (item map placing DRDP, FAST, GOLD, and WSS score levels on a common scale from roughly -3.5 to 5.5)
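The slide 29 item map comes from a concurrent calibration: items from several tools are placed on a single logit scale, typically because the same children respond to items from more than one tool, with items a child was never administered treated as missing by design. Below is a minimal sketch of that idea as a joint-maximum-likelihood Rasch fit; the function name, the fabricated response matrix, and the simple alternating Newton-Raphson loop are illustrative assumptions, not the pilot's actual estimation procedure.

```python
import numpy as np

def rasch_concurrent_calibration(responses, n_iter=100):
    """Joint (JML-style) Rasch calibration of a persons-by-items 0/1 matrix.

    np.nan marks items a child was not administered; leaving those cells
    missing-by-design is what lets items from different tools share one
    logit scale through the children who took more than one tool.
    """
    mask = ~np.isnan(responses)
    x = np.nan_to_num(responses)
    theta = np.zeros(responses.shape[0])   # person abilities (logits)
    b = np.zeros(responses.shape[1])       # item difficulties (logits)
    for _ in range(n_iter):
        # Newton-Raphson step for abilities with item difficulties held fixed.
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        resid, info = (x - p) * mask, p * (1 - p) * mask
        theta += resid.sum(axis=1) / np.maximum(info.sum(axis=1), 1e-9)
        # Newton-Raphson step for difficulties with abilities held fixed.
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        resid, info = (x - p) * mask, p * (1 - p) * mask
        b -= resid.sum(axis=0) / np.maximum(info.sum(axis=0), 1e-9)
        b -= b.mean()                      # identify the scale: mean difficulty = 0
    return theta, b

# Fabricated example: 6 children, tool A = items 0-2, tool B = items 3-5.
# Children 2-3 took both tools and link the two item sets onto one scale.
data = np.array([
    [1, 1, 0, np.nan, np.nan, np.nan],
    [1, 0, 0, np.nan, np.nan, np.nan],
    [0, 1, 1, 1, 1, 0],
    [1, 0, 0, 1, 0, 1],
    [np.nan, np.nan, np.nan, 1, 1, 0],
    [np.nan, np.nan, np.nan, 0, 1, 0],
], dtype=float)
abilities, difficulties = rasch_concurrent_calibration(data)
print(np.round(difficulties, 2))  # all six items on one logit scale
```

A real calibration would use purpose-built IRT software and handle polytomous score levels (the DRDP, GOLD, and WSS entries on the slide are multi-level), but the missing-by-design linkage shown here is the core of the concurrent approach.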
