

SLIDE 1

On Linking Learning, Assessment, and Interpretation

Min Li

Stanford University

Invited Talk, University of Washington, April 26, 2001

SLIDE 2

Overview of Talk

  • Conceptual framework: The assessment triangle
  • Dissertation study – evaluating the links between science achievement and assessments
  • Classroom assessment study – looking into students’ learning in the classroom through science journals
  • Directions and implications
SLIDE 3

Conceptual Framework: The Assessment Triangle

SLIDE 4

The Assessment Triangle

[Triangle diagram with three vertices: Learning/Achievement (cognition), Assessment, and Interpretation]

Pellegrino, Chudowsky, & Glaser, in press

SLIDE 5

Dissertation Study: Evaluating the Link between Science Achievement and Assessments

SLIDE 6

The Assessment Triangle: Dissertation Study

Science achievement as four types of knowledge:

  • Declarative
  • Procedural
  • Schematic
  • Strategic

Assessment methods:

  • Multiple-choice
  • Free-response
  • Performance assessment

Evaluate Interpretations:

  • Logically
  • Empirically
SLIDE 7

Defining Science Achievement

Science Achievement

  • Declarative: definitions, terms & facts
  • Procedural: steps, actions & algorithms
  • Schematic: theories & mental models
  • Strategic: strategies & conditional knowledge

SLIDE 8

Linking Knowledge Types to Assessment Methods

Knowledge Type and Assessment Method

  • Declarative (knowing that): multiple-choice, free-response
  • Procedural (knowing how): performance assessment
  • Schematic (knowing why): multiple-choice, free-response, concept-mapping
  • Strategic (knowing about knowing): performance assessment with an open structure
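The knowledge-type-to-method links on this slide can be sketched as a simple lookup table. The table and helper below are hypothetical illustrations, not artifacts from the talk:

```python
# A hypothetical lookup table (names are mine, not from the talk)
# capturing the slide's knowledge-type-to-method links.
METHODS_BY_KNOWLEDGE_TYPE = {
    "declarative": ["multiple-choice", "free-response"],
    "procedural": ["performance assessment"],
    "schematic": ["multiple-choice", "free-response", "concept-mapping"],
    "strategic": ["performance assessment with an open structure"],
}

def suggest_methods(knowledge_type: str) -> list[str]:
    """Return the assessment methods the framework links to a knowledge type."""
    return METHODS_BY_KNOWLEDGE_TYPE[knowledge_type.lower()]

print(suggest_methods("Schematic"))
# ['multiple-choice', 'free-response', 'concept-mapping']
```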

SLIDE 9

Method: Sample

  • TIMSS science items (American Pop 2)
    – Booklet 8: 22 multiple-choice and 10 free-response items
    – Performance assessment tasks
  • Ten experts selected for think-aloud study
    – 5 physics and 5 biology graduate students

SLIDE 10

Method: Procedure

The link was evaluated logically and empirically:

  • Logical analysis
    – Coding the characteristics of items and linking items to knowledge types
  • Empirical analysis
    – Exploratory and confirmatory factor analysis to examine the underlying covariance patterns
    – Think-aloud study to infer students’ cognition

SLIDE 11

Logical Analysis: Classification of TIMSS Items by Knowledge Type

[Bar chart: number of items classified as Declarative, Schematic, Procedural, and Strategic; the counts 13, 5, 6, 4, 3, and 1 appear in the chart, but the exact count-to-category mapping is not recoverable.]

SLIDE 12

Example: A Declarative-knowledge Item

  • P4. What happens when an animal hibernates?
  • A. There is no life in any of its parts.
  • B. It stops breathing.
  • C. Its temperature is higher than when it is active.
  • D. It is absorbing energy for use when it is active.
  • E. It is using less energy than when it is active.
SLIDE 13

Example: A Procedural-knowledge Item

  • P1. The graph shows the progress made by an ant moving along a straight line. If the ant keeps moving at the same speed, how far will it have traveled at the end of 30 seconds?

  • A. 5cm
  • B. 6cm
  • C. 20cm
  • D. 30cm
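The constant-speed reasoning this item calls for is a simple linear extrapolation. Since the slide's graph is not reproduced here, the reading used below (10 cm after 15 seconds) is a hypothetical value chosen purely for illustration:

```python
def distance_at(t_target: float, t_known: float, d_known: float) -> float:
    """Extrapolate distance at t_target assuming constant speed,
    given one (time, distance) reading taken from the graph."""
    speed = d_known / t_known  # cm per second
    return speed * t_target

# Hypothetical reading (the real values are on the slide's figure,
# which is not reproduced here): 10 cm traveled after 15 seconds.
print(distance_at(30, 15, 10))  # 20.0
```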
SLIDE 14

Example: A Schematic-knowledge Item

  • P5. The water in a tube is heated, as shown in the diagram. As the water is heated, the balloon increases in size. Explain why.

SLIDE 15

Factor Analysis

A good statistical fit was indicated by the fit measures:

  • χ² = 357.47, df = 333, p = .17
  • CFI = .999

[Path diagram: three-factor confirmatory model with Declarative, Schematic, and Procedural factors; standardized loadings and error variances for the individual TIMSS items (bsmsa7, bsmsa9, bsmsa11, ...) are shown in the diagram and are not reproduced here.]
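The reported CFI can be sanity-checked from the model chi-square. A minimal sketch of the standard CFI formula follows; the model values come from the slide, but the baseline (independence-model) chi-square and df are hypothetical, chosen only to illustrate how a CFI near the reported .999 arises:

```python
def cfi(chi2_model: float, df_model: int,
        chi2_baseline: float, df_baseline: int) -> float:
    """Comparative Fit Index from model and baseline (independence-model)
    chi-square statistics: CFI = 1 - max(d_M, 0) / max(d_B, d_M, 0),
    where d = chi-square minus degrees of freedom."""
    d_model = chi2_model - df_model
    d_baseline = chi2_baseline - df_baseline
    return 1.0 - max(d_model, 0.0) / max(d_baseline, d_model, 0.0)

# Model values from the slide; the baseline chi-square and df are
# hypothetical, chosen only to illustrate a CFI near the reported .999.
print(round(cfi(357.47, 333, 25000.0, 378), 3))  # 0.999
```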

SLIDE 16

Factor Analysis

  • Knowledge-type items clustered together, showing significant loadings on the three knowledge factors.
  • Declarative, procedural, and schematic knowledge factors were highly correlated.
  • Comparison with alternative models (e.g., one general factor) favored the knowledge-factor model.

[Path diagram repeated from slide 15.]

SLIDE 17

Protocol Analysis

Examples of Experts’ Verbalizations

  • P4. What happens when an animal hibernates?
  • A. There is no life in any of its parts.
  • B. It stops breathing.
  • C. Its temperature is higher than when it is active.
  • D. It is absorbing energy for use when it is active.
  • E. It is using less energy than when it is active.

The declarative-knowledge item

SLIDE 18

Protocol Analysis

Examples of Experts’ Verbalizations

Expert 1: “What happens when an animal hibernates? Okay, what I know, hibernation means, sleeps for a long time and heart rate slows down …” Expert 2: “The answer is just said there (from reading the statement)… all the process slows down. Well, I know when animals hibernate, they lay down and they do not use too much energy.”

SLIDE 19

Protocol Analysis

Examples of Experts’ Verbalizations

  • P5. The water in a tube is heated, as shown in the diagram. As the water is heated, the balloon increases in size. Explain why.

The schematic-knowledge item

SLIDE 20

Protocol Analysis

Examples of Experts’ Verbalizations

Expert 1: “… when it is hotter, the atoms move faster. So they evaporate, so there is more vapor. That vapor, that is more pressure on the balloon…” Expert 2: “The balloon increases because of pressure, that is what causes the balloon expands. And also, when you heat something, even without water, when you heat gas, pressure of the volume tends to increase.”

SLIDE 21

Conclusions of Dissertation Study

  • Logical and factor analyses supported the distinctions between knowledge types.
  • Protocol analysis revealed differences in use of knowledge types, partially due to item characteristics.

SLIDE 22

Classroom Assessment Study: Looking into Students’ Learning in the Classroom through Science Journals

SLIDE 23

A science journal is a compilation of entries that provides a partial record of the instructional experiences a student had in her classroom during a certain period of time.

SLIDE 24

The Assessment Triangle: Science Journals

Journals as Assessment Tools:

  • at the individual level and at the aggregated classroom level
  • an immediate/unobtrusive assessment

Learning/Achievement:

  • Opportunities to learn
    – Instructional implementation
    – Quality of teacher feedback
  • Student performance
    – Scientific communication
    – Conceptual understanding
    – Procedural understanding
SLIDE 25

The Assessment Triangle: Science Journals

Journals as Assessment Tools: Interpretation

  • Can science journals provide trustworthy and valid evidence on student performance?
  • What do journals tell us about student performance?
  • What do journals tell us about opportunity to learn?
SLIDE 26

Method

  • Sample
    – 10 fifth-grade classrooms
    – Two Full Option Science System (FOSS) units were taught: Variables in the fall and Mixtures in the spring.
    – A stratified random sample from each class: 2 low, 2 middle, and 2 high achievers
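The sampling step described above, drawing a fixed number of students from each achievement stratum, might be sketched as follows. The roster, field names, and seed are hypothetical, invented only to make the sketch runnable:

```python
import random

def stratified_sample(students, n_per_stratum=2, seed=0):
    """Draw n students at random from each achievement stratum,
    mirroring the study's 2 low / 2 middle / 2 high design."""
    rng = random.Random(seed)
    sample = []
    for stratum in ("low", "middle", "high"):
        pool = [s for s in students if s["stratum"] == stratum]
        sample.extend(rng.sample(pool, n_per_stratum))
    return sample

# Hypothetical class roster of 30 students, 10 per stratum
roster = [{"name": f"{st}-{i}", "stratum": st}
          for st in ("low", "middle", "high") for i in range(10)]
picked = stratified_sample(roster)
print(len(picked))  # 6
```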

SLIDE 27

Method

  • Coding
    – Each entry was coded on four dimensions:
      • Instructional implementation
      • Type of entry
      • Student performance
      • Teacher feedback
    – Analytic coding criteria defined the score values.

SLIDE 28

Method

  • Procedures
    – Pre-posttest design using performance assessments
    – 28 Variables and 22 Mixtures journals were coded by two coders.

SLIDE 29

Technical Characteristics of Journal Scores

Reliability (intercoder):

                                 Variables   Mixtures
  Type of entry (% agreement)        85          85
  Student performance               .85         .84
  Teacher feedback                  .91         .85

Validity:

Students’ journal scores were correlated with their performance assessment scores (on average r = .52).
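The reliability and validity statistics above rest on two simple computations: percent agreement between coders and a Pearson correlation. A minimal sketch follows; the coder data are invented for illustration:

```python
def percent_agreement(codes_a, codes_b):
    """Intercoder reliability as simple percent agreement."""
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100.0 * matches / len(codes_a)

def pearson_r(x, y):
    """Pearson correlation, e.g. between journal scores and
    performance-assessment scores."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var_x = sum((a - mx) ** 2 for a in x)
    var_y = sum((b - my) ** 2 for b in y)
    return cov / (var_x * var_y) ** 0.5

# Invented codes from two coders for five journal entries
print(percent_agreement([1, 2, 2, 3, 1], [1, 2, 3, 3, 1]))  # 80.0
```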

SLIDE 30

Student Performance

[Bar chart: mean journal scores (scale 0.5 to 3) for scientific communication, conceptual understanding, and procedural understanding, comparing Variables (fall) and Mixtures (spring).]

SLIDE 31

Opportunity to Learn: Learning Activities

[Bar chart: percentage of entries involving defining, exemplifying, and applying concepts, for Variables and Mixtures.]

SLIDE 32

Opportunity to Learn: Learning Activities

[Bar charts: percentage of entries by activity, including defining, exemplifying, and applying concepts, and including predicting results, interpreting results, following procedures, and designing experiments, for Variables and Mixtures.]

SLIDE 33

Opportunity to Learn: Teacher Feedback

  • Teachers did not provide feedback despite errors or misconceptions that were evident in the students’ journals.
  • Only 4 of the 10 teachers provided feedback!

[Bar chart: percentage of entries receiving no feedback, wrong feedback, feedback on how to improve, or grades/check marks, in the 4 feedback classrooms.]

SLIDE 34

Conclusions of Classroom Assessment Study

  • Science journals can be reliably scored and used as a valid assessment tool.
  • Students did poorly in scientific communication and showed partial science understanding in their journals.
  • Most teachers did not effectively use science journals.
  • Teachers had very limited content knowledge; they did not know how to promote or assess student learning.

SLIDE 35

Directions and Implications

SLIDE 36

Directions and Implications

Dissertation Study

  • What other assessment methods can tap into these knowledge types?
  • How can we measure types of knowledge at the classroom level?
  • How can we use the distinctions between knowledge types to improve teachers’ practice?
  • How can we use these distinctions to promote students’ learning?

SLIDE 37

Directions and Implications

Classroom Assessment Study

  • Can journals be used to gather information on teaching for accountability purposes?
  • How can we make the use of journals relevant to and practical for teachers in their practice?
  • How can we use these findings to improve teachers’ classroom assessment practices?

SLIDE 38

Directions and Implications

My Research on Assessment

[The assessment triangle: Learning/Achievement (cognition), Assessment, Interpretation]

SLIDE 39

THANK YOU! I look forward to your questions and comments.
