7. Testing Testing: Big Questions


SLIDE 1
  • 7. Testing
SLIDE 2

SLIDE 3

Testing: Big Questions

  • How do teachers construct tests?
  • How are teacher-made tests like/unlike standardized tests?
  • What information comes from test results?

SLIDE 4

7.1 Instructional Objectives
7.2 Teacher-Developed Tests in the Classroom
7.3 Formative Evaluation
7.4 Classroom Grading Approaches

SLIDE 5

7.5 Criterion-Referenced Testing
7.6 Norm-Referenced Testing
7.7 Interpreting Norm-Referenced Test Scores
7.8 Validity

SLIDE 6

7.9 Reliability
7.10 Test Bias
7.11 Using Tests Appropriately
7.12 Summary

SLIDE 7

7.1 Instructional Objectives

SLIDE 8

SLIDE 9

Objectives: Checklist for learning

  • More specific than goals
  • What students should know or be able to do by end of lesson ➔ descriptive verbs!
  • Taxonomies provide hierarchies of increasing sophistication
  • Bloom: cognitive, affective, psychomotor

SLIDE 10

Bloom’s taxonomies

  • Cognitive most used
  • 6 levels: remember, comprehend, apply, analyze, evaluate, create
  • Objective: “Students will compare and contrast yurts and tipis, in 3 key features.”
  • Note task, level (analysis), criteria ➔ “mastery learning” system

SLIDE 11

7.2 Teacher-Developed Tests in the Classroom

SLIDE 12

SLIDE 13

Classroom assessment

Backward planning as a “best practice”:

  1. Write objective with taxonomy-level verb and criteria for mastery
  2. Create assessment/test that fits the objective
  3. Plan learning activities that support and prepare students for mastery

SLIDE 14

Classroom tests

  • Essay: for comprehension, analysis; needs criteria
  • Multiple choice, matching for recognition
  • T/F, fill-in-blanks for recall
  • Problem-solving for application/analysis
 ➔ Consider pros/cons and the kind of students who benefit

SLIDE 15

SLIDE 16

Performance-based or authentic assessment 1

  • Portfolio showing progress
  • Exhibition, e.g. posters
  • Demonstration, e.g. slide shows, videos
  • For individual or group assessment

SLIDE 17

Authentic assessment 2

Rubric with criteria for scoring (posted for all to see):

  Criterion    10 points    5 points
  Sources      Over 5       Under 5
  Facts        Over 10      Under 10
  Format       Correct      Errors
  Graphics     Over 5       Under 5
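To show how such a rubric turns counts into points, here is a minimal sketch in Python; the thresholds mirror the table above, while the function and variable names are invented for illustration:

```python
# Minimal sketch: score a project against the rubric above.
# Thresholds come from the table; all names are illustrative only.

RUBRIC = {
    "sources":  lambda n: 10 if n > 5 else 5,     # over 5 sources   -> 10 pts
    "facts":    lambda n: 10 if n > 10 else 5,    # over 10 facts    -> 10 pts
    "format":   lambda ok: 10 if ok else 5,       # correct format   -> 10 pts
    "graphics": lambda n: 10 if n > 5 else 5,     # over 5 graphics  -> 10 pts
}

def score_project(sources, facts, format_correct, graphics):
    """Return per-criterion points and the total."""
    points = {
        "sources":  RUBRIC["sources"](sources),
        "facts":    RUBRIC["facts"](facts),
        "format":   RUBRIC["format"](format_correct),
        "graphics": RUBRIC["graphics"](graphics),
    }
    return points, sum(points.values())

points, total = score_project(sources=7, facts=12, format_correct=True, graphics=4)
print(points, total)  # {'sources': 10, 'facts': 10, 'format': 10, 'graphics': 5} 35
```

Posting the thresholds up front, as the slide suggests, is what makes the scoring this mechanical: students can predict their own score before submitting.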

SLIDE 18

7.3 Formative Evaluation

SLIDE 19

SLIDE 20

Formative assessments 1

  • Assess/evaluate learning needs before instruction (aka “pretest”)
  • Determine previous knowledge of topic or skill
  • Determine readiness for skill or topic

SLIDE 21

Formative assessments 2

  • Check understanding, monitor progress during learning cycle
  • Spot errors for re-teaching
  • Give feedback and suggestions
  • Check readiness for final (summative) assessment (aka “posttest”)

SLIDE 22

7.4 Classroom Grading Approaches

SLIDE 23

SLIDE 24

Assigning grades 1

When a student gets a grade for work, what does he/she think it means?

  • This is what I am worth
  • This is how I compare with classmates
  • This is what the teacher thinks of me
  • This is how well I learned
SLIDE 25

Assigning grades 2

  • Letter grades: A, B, C, D, F
  • Absolute: 10 points per letter
  • Curve (relative): comparative scaling (force bell curve?)
  • Descriptive (short or long)
  • Performance rating (with rubric/criteria)
  • Mastery checklist (# of attempts not important)
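The absolute 10-points-per-letter scale reduces to a simple lookup; a minimal sketch, assuming the conventional 90/80/70/60 cutoffs (the slide gives only the 10-point spacing):

```python
def letter_grade(percent: float) -> str:
    """Absolute scale: each letter spans 10 points (assumed 90/80/70/60 cutoffs)."""
    if percent >= 90: return "A"
    if percent >= 80: return "B"
    if percent >= 70: return "C"
    if percent >= 60: return "D"
    return "F"

print(letter_grade(84))  # B
```

A curved (relative) grade, by contrast, cannot be computed from one score alone: it depends on where the score falls in the class distribution.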

SLIDE 26

7.5 Criterion-Referenced Testing

SLIDE 27

SLIDE 28

Criterion referencing 1

  • Emphasis on mastery of specific skills/objectives
  • Good for topics that can be broken into small objectives
  • Good for topics that have a hierarchy of skills (e.g. math)
  • Must master skill A before you can understand and master skill B

SLIDE 29

Criterion referencing 2

  • Set-up: objective and performance criteria to prove mastery for each skill (e.g. 80% correct answers)
  • No comparisons (and no time constraints?) ➔ move to next level at own pace
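A sketch of such a mastery gate, using the 80%-correct criterion from the slide as the threshold (the function name and structure are illustrative):

```python
MASTERY_THRESHOLD = 0.80  # e.g. 80% correct answers, per the slide

def has_mastered(correct: int, total: int) -> bool:
    """True if the student may advance to the next skill level."""
    return total > 0 and correct / total >= MASTERY_THRESHOLD

print(has_mastered(17, 20))  # True:  85% >= 80%
print(has_mastered(15, 20))  # False: 75% <  80%
```

Note there is no comparison to classmates anywhere in the check: only the student's own performance against the fixed criterion matters.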

SLIDE 30

7.6 Norm-Referenced Testing

SLIDE 31

SLIDE 32

Norm referencing

  • “Standardized”
  • Comparative with other students
  • Achievement tests (what has been learned, e.g. state/graduation test)
  • Aptitude tests (predict future success, e.g. IQ, SAT, GRE)

SLIDE 33

7.7 Interpreting Norm-Referenced Test Scores

SLIDE 34

SLIDE 35

Analyzing test results (1)

  • Raw scores ➔ derived (comparative) scores
  • “Normed” with large samples of test-takers
  • Norming = fitted onto normal distribution (bell curve)
  • Bell curve: mean/average (skewed by extremes), median (middle #), and mode (most frequent) are the same
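A small worked example of the three descriptors, using Python's statistics module on an invented score list (in a perfectly normal distribution all three would coincide):

```python
import statistics

scores = [70, 75, 80, 80, 85, 90, 95]  # illustrative raw scores

print(statistics.mean(scores))    # ~82.14 (average; pulled by extreme values)
print(statistics.median(scores))  # 80    (middle value of the sorted list)
print(statistics.mode(scores))    # 80    (most frequent value)
```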

SLIDE 36

Analyzing test results (2)

Statistical descriptors

  • Areas of distribution marked by standard deviations = deviations from average
  • Example: IQ tests: 100 = avg.; 34% of scores fall within 1 SD on either side of average
  • Z-scores: # of standard deviations +/- from average
  • Stanines: #5 in center; 1-4 below, 6-9 above
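A minimal sketch of z-scores and stanines; the IQ mean of 100 comes from the slide, while the SD of 15 and the stanine formula (mean 5, SD 2, clamped to 1-9) are standard conventions assumed here:

```python
def z_score(raw: float, mean: float, sd: float) -> float:
    """Number of standard deviations above (+) or below (-) the average."""
    return (raw - mean) / sd

def stanine(z: float) -> int:
    """Stanine scale: mean 5, SD 2, clamped to the 1-9 range."""
    return max(1, min(9, round(z * 2 + 5)))

z = z_score(115, mean=100, sd=15)  # assumed IQ scale: mean 100, SD 15
print(z)           # 1.0 -> one SD above average
print(stanine(z))  # 7
```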

SLIDE 37

Analyzing test results (3)

More statistical descriptors

  • Percentiles = % of students performing same or below
  • Example: 80th percentile = performs better than 80% of others
  • Grade-level equivalents = grade + month of school year
  • Example: 3.4 = 3rd grade, 4th month
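A sketch of percentile rank under the “same or below” definition given above; the cohort scores are invented:

```python
def percentile_rank(score: float, all_scores: list[float]) -> float:
    """Percent of test-takers scoring the same or below the given score."""
    at_or_below = sum(1 for s in all_scores if s <= score)
    return 100 * at_or_below / len(all_scores)

cohort = [55, 60, 65, 70, 75, 80, 85, 90, 95, 100]  # illustrative scores
print(percentile_rank(90, cohort))  # 80.0 -> "80th percentile"
```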

SLIDE 38

7.8 Validity

SLIDE 39

SLIDE 40

How is a test valid?

  • Validity: accuracy of measurement
  • Content: matches what was in curriculum
  • Face: appropriate format
  • Criterion-related: items match objectives
  • Predictive: matches future performance
  • Construct: matches other tests
SLIDE 41

7.9 Reliability

SLIDE 42

SLIDE 43

How is a test reliable?

  • Reliability = consistency
  • Test-retest
  • Alternate/parallel (versions)
  • Split-half = odds/evens
  • Kuder-Richardson = 1 test (internal consistency)

SLIDE 44
  • Perfect = 1.0, but .80 OK
  • 0 = no correlation
  • Negative value = as one factor goes up, other goes down
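A minimal sketch of the split-half approach named above: correlate odd-item vs. even-item subscores across students, then step the half-length correlation up with the Spearman-Brown correction (the correction step and the response data are assumptions for illustration; statistics.correlation needs Python 3.10+):

```python
import statistics

def split_half_reliability(item_matrix: list[list[int]]) -> float:
    """Split-half reliability: correlate odd vs. even item subscores,
    then apply the Spearman-Brown full-length correction."""
    odd_scores  = [sum(row[0::2]) for row in item_matrix]  # items 1, 3, 5, ...
    even_scores = [sum(row[1::2]) for row in item_matrix]  # items 2, 4, 6, ...
    r_half = statistics.correlation(odd_scores, even_scores)
    return 2 * r_half / (1 + r_half)  # Spearman-Brown correction

# Rows = students, columns = items (1 = correct, 0 = wrong); invented data.
responses = [
    [1, 1, 1, 1, 1, 0],
    [1, 1, 1, 0, 0, 0],
    [1, 0, 1, 1, 0, 1],
    [0, 0, 1, 0, 0, 0],
    [1, 1, 1, 1, 1, 1],
]
print(round(split_half_reliability(responses), 2))  # ~0.94 for this invented data
```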

SLIDE 45

7.10 Test Bias

SLIDE 46

SLIDE 47

Can a test be biased?

  • If content or format favors one SES, race, culture, gender, or learning style
  • Shows up in form/content of test question or answer
  • Partial solution: test in students’ native language
  • Not bias: males vary more than females in achievement scores

SLIDE 48

7.11 Using Tests Appropriately

SLIDE 49

SLIDE 50

Testing: Use wisely

  • Check validity and standard error of estimate (score +/-)
  • Check reliability and standard error of measurement (the confidence interval caused by the degree of unreliability)
  • Consider how scores and results will be used
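A sketch of the score band implied by the standard error of measurement, using the standard formula SEM = SD * sqrt(1 - reliability); the score, SD, and reliability values are invented:

```python
import math

def sem(sd: float, reliability: float) -> float:
    """Standard error of measurement: grows as reliability drops."""
    return sd * math.sqrt(1 - reliability)

def confidence_band(score: float, sd: float, reliability: float, z: float = 1.0):
    """Score +/- z * SEM (z = 1.0 gives a ~68% band, 1.96 a ~95% band)."""
    margin = z * sem(sd, reliability)
    return score - margin, score + margin

# Invented example: observed score 110, test SD 15, reliability .80
print(confidence_band(110, sd=15, reliability=0.80))         # ~(103.3, 116.7)
print(confidence_band(110, sd=15, reliability=0.80, z=1.96)) # ~(96.9, 123.1)
```

Reporting the band rather than the bare score is what “use wisely” amounts to in practice: a single observed score is only an estimate of the student's true score.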

SLIDE 51

7.12 Summary

SLIDE 52

SLIDE 53

Testing the test

  • What are you trying to find out, and at what point in the learning cycle?
  • Does a test report skill achievement or compare students?
  • Does a test measure what it should, consistently and without bias toward any learner?