

  1. A New Vision of Assessment: Texts Worth Reading, Problems Worth Solving, Tests Worth Taking (NCSA, June 2013)

  2. PARCC States

  3. PARCC Priorities
     1. Determine whether students are college and career ready or on track
     2. Connect to the Common Core State Standards
     3. Measure the full range of student performance, including that of high- and low-achieving students
     4. Provide educators data throughout the year to inform instruction
     5. Create innovative 21st century, technology-based assessments
     6. Be affordable and sustainable

  4. Getting All Students College and Career Ready (pipeline diagram)
     • K–2: Voluntary K–2 assessment being developed, aligned to the Common Core State Standards
     • Grades 3–8: Timely data showing whether ALL students are on track for college and career readiness
     • High School: College readiness score to identify who is ready for college-level coursework; targeted interventions and supports (e.g., state-developed 12th-grade bridge courses)
     • Goal: Success in first-year, credit-bearing postsecondary coursework
     • Throughout: Ongoing student support/interventions and professional development for educators

  5. Assessments: ELA/Literacy and Mathematics, Grades 3–11 (timeline diagram, beginning to end of school year)
     • Diagnostic Assessment (optional; flexible administration)
     • Mid-Year Assessment (optional)
     • Performance-Based Assessment (required)
     • End-of-Year Assessment (required)
     • Speaking and Listening Assessment

  6. Promoting Student Access
     PARCC is committed to the following principles:
     • Use Universal Design principles to create accessible assessments throughout every stage and component of the assessment
     • Minimize/eliminate features of the assessment that are irrelevant to what is being measured, so that all students can more accurately demonstrate their knowledge and skills
     • Measure the full range of complexity of the standards
     • Use technology to make all components of the assessment as accessible as possible
     • Conduct bias and sensitivity reviews of all PARCC items

  7. PARCC Accessibility System (three tiers)
     • Features for All Students: tools embedded in the test platform
     • Accessibility Features for All Students: identified in advance
     • Accommodations: for Students with Disabilities, English Learners, and English Learners with Disabilities

  8. Accessibility Features for All Students
     • Features that PARCC will make available to all students, either through the online platform or through the test administration process.
     • Each student should determine whether to use a support on an item-by-item basis, based on the supports they use during instruction and in daily life.
     • Some features must be identified in advance as part of the student's PNP (Personal Needs Profile) because of concerns about student overload or clashing supports.
     • All of these features are based on research and universal design principles.

  9. Accessibility Features for All Students: Embedded Features
     • Audio Amplification
     • Blank Paper (provided by test administrator)
     • Eliminate Answer Choices
     • Flag Items for Review
     • General Administration Directions Clarified (by test administrator)
     • General Administration Directions Read Aloud and Repeated as Needed (by test administrator)
     • Highlight Tool
     • Headphones
     • Magnification/Enlargement Device
     • NotePad
     • Pop-up Glossary
     • Redirect Student to the Test (by test administrator)
     • Spell Checker
     • Writing Tools

  10. Embedded Features Demonstration Example of “eliminate answer choice.” *NOTE: NOT a PARCC item. Not on the PARCC delivery platform.

  11. Embedded Features Demonstration Example of “highlighting.” *NOTE: NOT a PARCC item. Not on the PARCC delivery platform.

  12. Accessibility Features to Be Selected in Advance
     • Adaptive and Specialized Equipment or Furniture
     • Answer Masking
     • Background/Font Color (Color Contrast)
     • General Masking
     • Line Reader Tool
     • Text-to-Speech for the Mathematics Assessments

  13. The PARCC Assessment System: Design, Development and Critical Advances

  14. Model Content Frameworks and Assessment Development
     • The Model Content Frameworks were developed through a state-led process that included content experts from PARCC member states and members of the Common Core State Standards writing team.
     • The Model Content Frameworks were constructed based on the Common Core State Standards for use in guiding and framing item development for the PARCC assessment.

  15. What Is Different About PARCC's Development Process?
     • PARCC states first developed the Model Content Frameworks to provide guidance on key elements of excellent instruction aligned with the Standards.
     • Those Frameworks then informed the assessment blueprint design.
     So, for the first time...
     • PARCC is communicating in the same voice to teachers as it is to assessment developers!
     • PARCC is designing the assessments around exactly the same critical content the standards expect of teachers and students.

  16. Evidence-Centered Design (ECD) for the PARCC Assessments
     • Model Content Frameworks: to make claims about what students know, we must operationalize the standards
     • Evidence Statements: based on analysis, evidence drives task development
     • Tasks: tasks are designed to elicit specific evidence from students

  17. Item Development
     • Item development began in fall 2012.
     • Item and passage reviews take place regularly, with teams of reviewers:
       o K-12 content experts
       o Higher education faculty
       o Local educators
       o Community members
     • Item development is on schedule, and the vendors will meet the August 30 benchmark to complete all items for field testing.

  18. PARCC Cognitive Complexity Framework Guides Item Development
     • The CCSS demand a new type of cognitive complexity framework.
     • PARCC partnered with its item development contractors to develop a new cognitive complexity framework.
     • The new framework:
       o Provides a systematic, replicable method for determining item cognitive complexity
       o Provides measurement precision at all levels of the test score scales
       o Enables development of test forms with adequate score reliability to support achievement growth interpretations

  19. PARCC's Cognitive Complexity Framework for ELA/Literacy
     • The Cognitive Complexity Framework guides item development and recognizes that text complexity and item/task complexity interact to determine the overall complexity of a task.
     • For the reading claim, the performance levels at each grade level are differentiated by three factors: (1) text complexity; (2) the range of accuracy in expressing reading comprehension demonstrated in student responses; and (3) the quality of evidence cited from sources read.
     • For the writing claim, performance level descriptors (PLDs) are written for the two sub-claims: (1) written expression, and (2) knowledge of language and conventions.

  20. Claims Driving Design: ELA/Literacy (claims diagram)
     Master claim: Students are on track or ready for college and careers.
     • Students read and comprehend a range of sufficiently complex texts independently (Reading: Informational Text; Reading: Literature; Vocabulary Interpretation and Use)
     • Students write effectively when using and/or analyzing sources (Written Expression; Conventions and Knowledge of Language)
     • Students build and present knowledge through research and the integration, comparison, and synthesis of ideas

  21. Item Types That Showcase Students' Command of Evidence with Complex Texts
     • Evidence-Based Selected Response (EBSR): combines a traditional selected-response question with a second selected-response question that asks students to show evidence from the text that supports the answer they provided to the first question. Underscores the importance of Reading Anchor Standard 1 for implementation of the CCSS.
     • Technology-Enhanced Constructed Response (TECR): uses technology to capture student comprehension of texts in authentic ways that have been difficult to score by machine for large-scale assessments (e.g., drag and drop, cut and paste, shade text, move items to show relationships).
     • Range of Prose Constructed Responses (PCR): elicits evidence that students have understood a text or texts they have read and can communicate that understanding well, both in terms of written expression and knowledge of language and conventions. There are four of these items of varying types on each annual performance-based assessment.

  22. Research Simulation Task (Grade 7): Amelia Earhart's Disappearance

  23. Understanding the Research Simulation Task
     • Session 1:
       o Students begin by reading an anchor text that introduces the topic. EBSR and TECR items ask students to gather key details about the passage to support their understanding.
       o Then, they write a summary or short analysis of the piece.
     • Session 2:
       o Students read two additional sources (which may include a multimedia text) and answer a few questions about each text to learn more about the topic, so they are ready to write the final essay and to show their reading comprehension.
       o Finally, students mirror the research process by synthesizing their understanding into an analytic essay, using textual evidence from several of the sources.

  24. Texts Worth Reading?
     • Range: an example of assessing reading across the disciplines, helping to satisfy the 55%-45% split of informational text to literature at the 6-8 grade band.
     • Quality: the texts on Amelia Earhart represent content-rich nonfiction on a historically significant topic.
     • Complexity: quantitatively and qualitatively, the passages have been validated and deemed suitable for use at grade 7.
