SLIDE 1

Tools for Assessing Impacts on Teacher Knowledge for Mathematics and Science Teaching

Dan Heck dheck@horizon-research.com Sean Smith ssmith62@horizon-research.com

SLIDE 2

Existing Tools: Science

SLIDE 3

Diagnostic Teacher Assessments in Mathematics and Science

  • Assessments in life, earth, and physical science (one in each area)
  • Knowledge domains:
    – declarative knowledge
    – scientific inquiry and procedures
    – schematic knowledge
    – pedagogical content knowledge (PCK)
    – science, technology, and society knowledge
  • Each form has 20 multiple-choice and 5 open-ended items
  • Straight content (except for PCK)
  • Available on a fee basis; $7 per teacher for scoring
  • Contact Bill Bush at U of L: bill.bush@louisville.edu

http://louisville.edu/edu/crmstd/diag_sci_assess_middle_teachers.html

SLIDE 4

Sample Multiple Choice Item

SLIDE 5

If a constant net force greater than zero is applied to a ball, what would you observe?

  • A. Not much, because a “net” force is always weak.
  • B. The ball will go at a constant speed in a straight line.
  • C. The ball speeds up, slows down, or changes direction.
  • D. The ball will eventually explode or disintegrate.
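For reference, the physics behind this item follows directly from Newton's second law: a constant nonzero net force produces a nonzero acceleration, so the velocity must change in magnitude and/or direction.

```latex
% Newton's second law: nonzero net force implies changing velocity
\vec{F}_{\text{net}} = m\vec{a},
\qquad
\vec{F}_{\text{net}} \neq \vec{0}
\;\Longrightarrow\;
\vec{a} = \frac{d\vec{v}}{dt} \neq \vec{0}
```

That is, the ball speeds up, slows down, or changes direction, matching option C.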

SLIDE 6

Sample Open-ended Item

SLIDE 7

After a lab that involved magnetism and compasses, a student writes that a magnet can’t function on the Moon because there are no magnetic poles on the Moon as there are on Earth. Identify this student’s misconception and describe an appropriate strategy to counteract this misconception.

SLIDE 8

MOSART: Misconception Oriented Standards-based Assessment Resource for Teachers

(NSF Grant No. 0412382)

  • Probes for conceptual shift(s) resulting from

professional development or course work

  • Distractors based on published misconceptions
  • Each test is 20 m-c items
  • Same tests for teachers and students
  • Available at no cost
  • Contact Phil Sadler

http: / / www.cfa.harvard.edu/ smgphp/ mosart/ about_mosart.html
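MOSART itself does not prescribe an analysis, but one common way to quantify the conceptual shift between pre- and post-administrations of the same 20-item test is Hake's normalized gain. A minimal sketch (the function name and 20-item default are illustrative, not part of MOSART):

```python
def normalized_gain(pre_correct: int, post_correct: int, n_items: int = 20) -> float:
    """Hake's normalized gain g = (post - pre) / (max - pre),
    with scores expressed as numbers of correct items."""
    if pre_correct == n_items:
        # Already at ceiling on the pre-test; gain is undefined
        raise ValueError("pre-test score at ceiling")
    return (post_correct - pre_correct) / (n_items - pre_correct)

# Example: a teacher moves from 8/20 before PD to 14/20 after
g = normalized_gain(8, 14)  # (14 - 8) / (20 - 8) = 0.5
```

The gain is the fraction of the available improvement actually realized, which makes scores comparable across teachers who start at different levels.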

SLIDE 9

Sample Item

SLIDE 10

Sue sticks one end of a metal rod into a box filled with ice. The end of the rod that is covered with ice becomes cold. After a while Sue places her hand on the upper end of the rod outside the box and feels that it is cold. What do you think has happened?

  • a. Cold has transferred from the lower end of the rod to the upper end.
  • b. The rod gave up heat to the ice.
  • c. Cold moved from Sue’s hands towards the rod.
  • d. Heat moved from the rod to Sue’s hand.
  • e. It depends on the original temperature of the rod.
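The distractors here target the well-documented misconception that “cold” is a substance that moves. In the accepted model only heat is transferred, and it always flows from higher to lower temperature, as Fourier's law of conduction expresses:

```latex
% Fourier's law: heat flux flows down the temperature gradient (hot to cold)
\vec{q} = -k\,\nabla T
```

So the rod gives up heat to the ice, and Sue's hand feels cold because heat flows out of her hand into the cooler rod, not because cold travels up it.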
SLIDE 11

ATLAST: Assessing Teacher Learning About Science Teaching

(NSF Grant No. 0335328)

SLIDE 12

Common Features of All Items

  • All are multiple choice
  • All are keyed to a specific sub-idea
  • All are set in the context of work that teachers do

SLIDE 13

Sample Item 

SLIDE 14

Level 2 Item Features

  • Address teachers’ ability to analyze student thinking using science content knowledge
  • Cannot be answered without content knowledge
  • Only one answer choice is “content-correct” and relevant to the instructional context
  • Fairly high cognitive load

SLIDE 15

Common Errors Made With Level 2 Items

  • Teachers look for common student thinking rather than the thinking of these students
  • Teachers look for a correct statement
  • Teachers try to answer the student item
  • Teachers look for familiar wording, e.g., “equal and opposite”
  • Teachers need options that allow them to hold naïve conceptions

SLIDE 16

Sample Item 

SLIDE 17

Level 3 Item Features

  • Address teachers’ ability to make instructional decisions using science content knowledge
  • Cannot be answered without content knowledge
  • Only one answer choice is “content-correct” and relevant to the instructional context
  • High cognitive load

SLIDE 18

Common Errors Made With Level 3 Items

  • Teachers see all activities/questions as “best”
    – Lack of content knowledge
    – High cognitive load
  • Context is important
    – Focus on logistics
    – Unfamiliar scenario/equipment
  • Teacher beliefs

SLIDE 19

Types of Items

  • Knowledge of science content
  • Using science content knowledge to analyze student thinking
  • Using content knowledge to make instructional decisions

SLIDE 20

Pros and Cons

  • Pros
    – Rigorously developed
    – Strong validity
    – Minimally burdensome
    – No cost
  • Cons
    – Narrowly focused(?)
    – Only three assessments