Assessing NSF Programming -- Standards-Based Reform Assessment Technologies: The San Francisco Project (PowerPoint PPT Presentation)

Richard J. Shavelson and Maria Araceli Ruiz-Primo


SLIDE 1

Stanford University Transferring New Assessment Technologies to Teachers and Other Educators (NSF-ESI-9596080)

Assessing NSF Programming --
Standards-Based Reform
Assessment Technologies:
The San Francisco Project

Richard J. Shavelson and Maria Araceli Ruiz-Primo

Stanford University

Performance Effectiveness Review, National Science Foundation, January 27, 1999

SLIDE 2

Achievement Indicators: Instructional Sensitivity

Depth of assessment probe, from Classroom Instruction to Outside School Influences:

  • Immediate: Lab Notebooks & Classroom Tests
  • Close: “Embedded” Assessments -- A Slightly More Advanced Activity in Unit
  • Proximal: Same Concept/Principle -- New Investigation
  • Distal: Large-Scale Performance Assessment from State/National Curriculum Framework
  • Remote: Standardized National Science Achievement Tests

SLIDE 3

Agenda

PART I: Framework for Evaluating Science Education Reform

  • A Working (and Evolving) Definition of Science Achievement
  • Linking Assessments to Components of Achievement
  • Multilevel Achievement Assessment

PART II: The San Francisco Study

  • The Proximity of the Assessments
  • The Study
  • The Findings

PART III: Concluding Comments

  • What We Have Learned
  • The Larger Picture
SLIDE 4

Toward an Achievement Framework: Knowledge Components

  • Declarative Knowledge (knowing the “that”): domain-specific content -- facts, concepts, principles
  • Procedural Knowledge (knowing the “how”): production systems -- condition-action rules
  • Strategic Knowledge (knowing the “which,” “when,” and “why”): problem schemata/strategies/operation systems; cognitive tools: planning and monitoring

Characteristics that vary according to proficiency level (low to high):

  • Extent (How much?)
  • Structure (How is it organized?)
  • Others (Precision? Efficiency? Automaticity?)

SLIDE 5

Linking Assessments to Achievement Components

Assessment technologies are linked to the knowledge components (Declarative, Procedural, Strategic) and their characteristics (Extent, Structure, Others): Multiple-Choice tests, Conceptual Maps, Procedural Maps, Models/Mental Maps, and Performance Assessments.

SLIDE 6

Achievement Indicators: Instructional Sensitivity

Depth of assessment probe, from Classroom Instruction to Outside School Influences:

  • Immediate: Lab Notebooks & Classroom Tests
  • Close: “Embedded” Assessments -- A Slightly More Advanced Activity in Unit
  • Proximal: Same Concept/Principle -- New Investigation
  • Distal: Large-Scale Performance Assessment from State/National Curriculum Framework
  • Remote: Standardized National Science Achievement Tests

SLIDE 7

Variables Unit: Swingers Activity

Goal: Gain experience with the concepts of system, variable, and controlling and manipulating variables.
Assessment Task (Close): Same concept and (high) structure, slightly different materials and measurement method.

[Figure: the unit activity beside the close assessment task; “release position” labeled on the pendulum diagram]

SLIDE 8

Variables Unit: Lifeboats Activity

Goal: Gain experience with the concepts of system, variable, and controlling and manipulating variables.
Assessment Task (Proximal): Same concept; different structure (low), materials, and measurement method.

[Figure: the unit activity beside the proximal assessment task]

SLIDE 9

Mixtures & Solutions Unit: Reaching Saturation Activity

Goal: Gain experience with the concepts of mixtures and solutions, concentration, saturation, and chemical reaction.
Assessment Task (Close): Same concept; slightly different structure and materials, different measurement method.

[Figure: the unit activity beside the close assessment task]

SLIDE 10

Mixtures & Solutions Unit: Fizz Quiz Activity

Goal: Gain experience with the concepts of mixtures and solutions, concentration, saturation, and chemical reaction.
Assessment Task (Proximal): Same concept; slightly different materials, different (low) structure and measurement method.

[Figure: the unit activity beside the proximal assessment task]

SLIDE 11

CSIAC Performance Assessment: Is All the Trash the Same?

Distal Assessment Task

TASK 1: Sort your trash

  • Take all of the trash items out of Bag A and put them on your placemat.
  • Observe each trash item.
  • Sort the trash items into groups based on characteristics the items have in common.
  • Using all of the trash items, make at least three groups. Do not make more than five groups.
  • Each trash item can be in only one group.
  • Draw a circle around each group on your placemat.

Identify your trash

  • Look at the “Trash Identification Table” below.
  • Find a picture of each trash item that is on your placemat.
  • Find the number that matches each trash item in the “Trash Identification Table.”
  • Write the number for each trash item on your placemat next to each item.
  • Make sure that you put a number by every trash item.

TASK 2: Trash Chart

  • Record the information from your placemat onto the chart below. Write your reasons for putting the trash in each group under “Reasons.”

TASK 3: Trash Identification Table

SLIDE 12

Proximity Profile of the Assessments

The assessments are profiled on: Purpose, Content, Task Directedness, Materials, and Measurement Methods.

              VARIABLES    MIXTURES & SOLUTIONS
  Close:      Pendulum     Solutions
  Proximal:   Bottles      Mystery Powders
  Distal:     CSIAC        CSIAC

SLIDE 13

Study Questions

  • Does hands-on science instruction impact students’ performance?
  • If so, does the magnitude of impact differ depending on the proximity of the assessment to the curriculum?
  • Are findings replicable across curricular units?
SLIDE 14

Study Design

Participants: Ten schools from a medium-sized urban school district in the Bay Area; twenty teachers and about 500 fifth-graders.

Design: Replications across classrooms: Initial Student Status, Unit Implementation, Final Student Status.

Instrumentation (FOSS units):

              Variables (Fall)   Mixtures and Solutions (Spring)
  Close:      Pendulum           Saturation
  Proximal:   Bottles            Mystery Powders
  Distal:     CSIAC-PA & MC      CSIAC-PA & MC

SLIDE 15

Within-Classroom Design

  Sequence | Initial Status (Pretest) | Implementation   | Final Status (Posttest)
  1        | Close                    | Science Journals | Close, CSIAC-PA & MC
  2        | Proximal                 | Science Journals | Proximal, CSIAC-PA & MC

SLIDE 16

Instructional Sensitivity: Preliminary Results

[Bar chart: mean gain in SD units (roughly 0.2 to 1.4) on the Close assessments (Pendulum, Solutions) and the Proximal assessments (Bottles, Mystery Powders)]
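The charts report gains as standardized effect sizes (“mean gain in SD units”). A minimal sketch of one common convention -- posttest mean minus pretest mean, divided by the pretest SD (the slides do not say which SD the authors used) -- with hypothetical class scores:

```python
import statistics

def mean_gain_sd_units(pretest, posttest):
    """Standardized mean gain: (posttest mean - pretest mean) / pretest SD.

    One common effect-size convention; dividing by a pooled SD is another.
    """
    gain = statistics.mean(posttest) - statistics.mean(pretest)
    return gain / statistics.stdev(pretest)

# Hypothetical scores on a Close assessment, before and after the unit.
pre = [4, 5, 6, 5, 4, 6]
post = [6, 7, 7, 6, 6, 8]
print(round(mean_gain_sd_units(pre, post), 2))  # prints 1.86
```

A gain near 1 SD unit, as on the Close assessments above, is a large effect by the usual benchmarks.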

SLIDE 17

Close Assessments: Effects Across Classrooms

[Bar chart: mean gain in SD units (roughly 0.5 to 2.5) on the Close assessments (Pendulum, Solutions) for each of the 20 classrooms]

SLIDE 18

Proximal Assessment: Effects Across Classrooms

[Bar chart: mean gain in SD units (roughly -0.8 to 1.0) on the Proximal assessments (Bottles, Mystery Powders) for each of the 20 classrooms]

SLIDE 19

Correlations Among Assessments (Pilot Study)

  Sequence | Time of Assessment | Directedness    | Posttest PA | CSIAC
  1        | Pretest            | Structured      | .76         | .64
           |                    | Less-Structured | .52         | .32
           | Posttest           | Structured      |             | .64
           |                    | Less-Structured |             | .07
  2        | Pretest            | Less-Structured | .71         | .03
           |                    | Structured      | .66         | .43
           | Posttest           | Less-Structured |             | .20
           |                    | Structured      |             | .57

(Unit groups: Mixtures & Solutions and Variables.)
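Each table entry is a correlation between students’ scores on a pair of assessments. A minimal sketch of computing one such Pearson correlation from paired scores (the student data here are hypothetical):

```python
def pearson_r(x, y):
    """Pearson correlation between two paired lists of scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# Hypothetical performance-assessment and CSIAC scores for six students.
pa = [10, 12, 9, 15, 11, 14]
csiac = [20, 23, 19, 28, 22, 25]
print(round(pearson_r(pa, csiac), 2))  # prints 0.99
```

Low correlations between structure conditions (e.g. the .07 and .03 entries) suggest the structured and less-structured tasks tap different aspects of performance.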

SLIDE 20

Immediate Assessment: Students’ Science Journals

Unit Implementation:

  • What instructional activities were implemented, as reflected in the students’ journals?
  • Were other appropriate additional activities implemented?

Students’ Performance:

  • Were students’ communications complete, focused, organized?
  • Did students’ communications indicate conceptual and procedural understanding?

Teacher’s Feedback:

  • Did the teacher provide helpful feedback on students’ performance?
  • Did the teacher encourage students to reflect on their work?

SLIDE 21

Example of a Journal

Implementation of an Instructional Activity

SLIDE 22

Variation in Implementation Across Classrooms: Mixtures & Solutions (Pilot Study)

[Two bar charts by class (1-3): mean unit-implementation score (max 24), and the “Close” effect size -- mean gain in SD units]

SLIDE 23

What We Have Learned

  • Instruction had an impact on students’ performance.
  • Results were in the predicted direction across units: Close assessments were more sensitive to changes in students’ performance than Proximal assessments.
  • The sensitivity of Distal assessments could not be evaluated, since pretest data were not available.
  • There was high variation in the quality of implementation of hands-on science instruction.
  • There was high between-class variation in effect sizes.
  • The use of more distal or remote measures may lead to the erroneous conclusion that the reform has no impact.
  • However, if impact is evident only at the closest level, this raises questions about the reform itself.

SLIDE 24

The Larger Picture

  • More than one source of evidence should be used to evaluate the impact of instruction on students’ achievement. Evidence should bear on declarative, procedural, and strategic knowledge.

Warning: Attention to different types of knowledge (and corresponding assessments) may lead to changes in curriculum!

  • E&HR should support:
    – Curriculum developers’ efforts to put assessments into their curricula
    – One or more Centers that serve as a resource to educators as they attempt to build and implement assessment in teaching
    – Basic research into the quality of alternative assessments
  • E&HR should make assessment research a focus.