A Timeline for Critical Review of DIBELS Next Goals


SLIDE 1

College of Education

THREE REASONS WHY THE UO RECOMMENDED DIBELS NEXT BENCHMARK GOALS ARE NECESSARY

In this session we’ll answer the question: Why were the DIBELS Next recommended goals established?

A Timeline for Critical Review of DIBELS Next Goals (Summer 2010 – Spring 2012)

  • DIBELS Next released; CTL is the only database supporting it
  • DMG technical reports for DIBELS Next released
  • CTL begins the Sentinel School Project and undertakes a study of DIBELS Next’s psychometric properties
  • CTL submits its DIBELS Next technical reports to peer review by an external, expert panel
  • Expert panel finds three major failings of the former DIBELS Next goals
  • Panel supports the superiority of the new goals; no major problems noted

A Timeline for Critical Review of DIBELS Next Goals (Winter 2012 – August 2012)

  • Winter 2012: CTL confirms the problems and begins a new goal‐setting process with a large, nationally representative sample
  • June 2012: CTL submits the new DIBELS Next goals to peer review by an external panel of practitioners and expert psychometricians
  • August 2012: CTL recommends the new DIBELS Next goals, giving users a choice of goals

SLIDE 2

What’s wrong with the former DIBELS Next goals?

Former DIBELS Next goals …

  • 1. Vary widely, yet still miss substantial numbers of children who need intervention (UO‐CTL, 2012c, p. 9)
    – On average, the former goals miss 40% of students who may need additional, strategic support
    – On average, the former goals miss 56% of students who may need intensive support
  • 2. Are based on a small sample that does not represent the diversity of U.S. children (DMG, 2011, p. 39)
  • 3. Do not use a consistent, external criterion measure to determine risk and cut‐points (DMG, 2011, pp. 48‐49)

Reason 1: DMG Former Goals Miss Children Who Require Intervention

What’s wrong with the former DIBELS Next goals? Assume we assess 100 students … and 40% need intervention …

SLIDE 3

And half of those (or 20% overall) need intensive intervention …

The former goals will, on average, miss 40% of students needing intervention, and will, on average, miss 56% of students needing intensive intervention (University of Oregon‐CTL, 2012c, p. 9).

What’s right about the new DIBELS Next goals? The UO‐CTL recommended goals identify 90% of students who may need support—ensuring more confidence in decision making (University of Oregon‐CTL, 2012c, p. 9).
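The example above can be checked with quick arithmetic. A minimal sketch follows; the percentages come from the slides (UO‐CTL, 2012c, p. 9), while the variable names and the rounding are ours, for illustration only:

```python
# Quick arithmetic check of the 100-student example above.
# Percentages come from the slides (UO-CTL, 2012c, p. 9);
# the structure of this sketch is illustrative.

total_students = 100
need_intervention = int(total_students * 0.40)   # 40 students need intervention
need_intensive = need_intervention // 2          # 20 of those need intensive support

# The former goals miss, on average, 40% and 56% of each group:
missed_strategic = need_intervention * 0.40
missed_intensive = need_intensive * 0.56

# The UO-CTL recommended goals identify 90% of students who may need support:
missed_new_goals = need_intervention * (1 - 0.90)

print(round(missed_strategic), round(missed_intensive), round(missed_new_goals))
# → 16 11 4
```

So out of 100 students assessed, the former goals would overlook roughly 16 of the 40 who need support and 11 of the 20 who need intensive support, versus about 4 under the recommended goals.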

SLIDE 4

Why the discrepancies?

DMG did not follow recommended, research‐based practices to create their goals. The DMG former goals were created with:

  • a sample of students that was small, and not representative of the nation
  • procedures that do not meet test and measurement standards in the field of education (Standards for Educational and Psychological Testing, AERA, APA, NCME, 1999; National Center on Response to Intervention)

Reason 2: DMG Former Goals Lack a Representative School & Student Sample

Representative Sample: critical and required for generalizability. Representative samples should include descriptions of the ethnicity and race, socioeconomic status, gender, and geographic locations of the participants, so that the sample can be appropriately compared to other students and schools at other time points (Standards for Educational and Psychological Testing, AERA, APA, NCME, 1999).

Let’s compare samples

[Bar chart: percent of students by race/ethnicity (White, Hispanic, African American, American Indian, Asian/Pacific Islander, Other, Multiracial) for the UO Recommended Goals sample, the DMG Former Goals sample, and all schools in the U.S. The DMG former‐goals sample is about 90.7% White, compared with roughly 55% White nationally.]

DMG Former (DMG, 2011, p. 39); UO Recommended (UO‐CTL, 2012a, p. 2); U.S. (NCES, 2011)

SLIDE 5

Let’s compare samples

Percent of students who qualify for Free and Reduced Lunch:

  • UO Recommended Goals sample: 63.7% average
  • U.S. average: 52.4%
  • DMG Former Goals sample: 16.0% average

Reason 3: DMG Former Goals Were Developed Using an Inconsistent and Non‐Validated Process to Determine Goals and Cut Points

Key Terms

Benchmark Goal: Students above the benchmark goal have a strong likelihood of meeting end‐of‐year performance standards on an important outcome measure, as long as continued good teaching occurs. Ensuring confidence in decision making.

Cut Point for Risk: Students who score below the cut point have a strong likelihood of NOT meeting end‐of‐year performance standards on an important outcome measure if intensive intervention is not provided.
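Together, the benchmark goal and the cut point for risk sort each score into one of three broad groups. A minimal sketch of that decision logic follows; the function name, category labels, and numbers are ours for illustration, not published DIBELS Next goals:

```python
def classify(score, benchmark_goal, cut_point):
    """Sort a screening score into a support category.

    Illustrative only: real benchmark goals and cut points come from
    the published DIBELS Next goal tables, not from this sketch.
    """
    if score >= benchmark_goal:
        return "at or above benchmark"   # likely to meet end-of-year standards
    if score < cut_point:
        return "below cut point"         # strong likelihood of NOT meeting them
    return "between cut point and goal"  # may need additional, strategic support

# Hypothetical benchmark goal of 90 and cut point of 70:
print(classify(95, 90, 70))  # → at or above benchmark
print(classify(80, 90, 70))  # → between cut point and goal
print(classify(60, 90, 70))  # → below cut point
```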

SLIDE 6

Why do schools use screening measures?

THE GOAL: To quickly determine how well students are performing, and to identify students at risk for reading difficulties or who need additional intervention.

[Diagram: Individual Measure → Goals & Cut Points → Outcome]

Why does your school use a screening measure?

NOT THE GOAL: To see if students will meet the DIBELS Next composite score.

[Diagram: Individual Measure → Goals & Cut Points → End of Year Composite Score]

Process Used to Establish the DMG Former Goals (DMG, 2011, pp. 48‐49)

[Diagram: a four‐step process in which individual DIBELS Next measures were linked to composite scores, and composite scores, through several intermediate composite‐score steps, to a global outcome measure, rather than linking each individual measure directly to an external outcome.]

The DMG goal‐setting process did not meet recommended research‐based educational standards. The Standards for Educational and Psychological Testing (AERA, APA, NCME, 1999) recommend a link between student performance on screening measures (e.g., DIBELS Next measures) and a standardized, widely used, external criterion measure (e.g., SAT10).

SLIDE 7

Variability Associated with the FORMER Goals

[Charts: DDS percentile ranks associated with the former goals at the beginning, middle, and end of the year vary across measures and grades. Source for percentiles: Cummings, Kennedy, Otterstedt, Baker, & Kame’enui, 2011]

Why does this negatively impact students and schools?

  • Complicates school‐level planning and coordination of intervention efforts
    – The former benchmark goals vary widely across measures, grades, and times of the year.
  • Makes evaluating progress within and across school years very unclear
    – When a non‐standard linking procedure is used, the actual “value” of the former goals is inconsistent across grades and measures.

Recommended Linking Procedure for Establishing Goals

[Diagram: the goals and cut points for each individual measure link directly to the Outcome.]
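One simple way to picture such a direct link is to choose each measure’s cut point so that a target share of the students who later miss the external criterion are flagged. The sketch below illustrates the general idea only, not the UO‐CTL procedure; the function, the toy data, and the 90% target are our assumptions:

```python
import math

def cut_point_for_risk(scores, met_criterion, target_sensitivity=0.90):
    """Pick a screener cut point that flags at least `target_sensitivity`
    of the students who later failed the external criterion measure.

    Illustrative only; real standard-setting uses large, representative
    samples and more careful psychometric procedures.
    """
    at_risk = sorted(s for s, met in zip(scores, met_criterion) if not met)
    needed = math.ceil(target_sensitivity * len(at_risk))
    # Flag everyone scoring below the cut: set it just above the score of
    # the needed-th lowest at-risk student.
    return at_risk[needed - 1] + 1

# Toy data: screener scores paired with whether each student later met
# the external criterion (e.g., a standardized reading test).
scores        = [10, 12, 15, 20, 22, 25, 28, 30, 33, 35, 40, 45, 50, 55, 60]
met_criterion = [False] * 10 + [True] * 5

print(cut_point_for_risk(scores, met_criterion))  # → 34
```

Because every measure’s cut point is anchored to the same external outcome, the “value” of a goal stays consistent across grades, measures, and times of the year.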

SLIDE 8

SUMMARY

New goals were needed to assist schools in making sound educational decisions. DMG former goals are problematic because they:

1. Miss substantial numbers of children who need intervention, providing a false level of confidence
2. Are based on a sample that does not represent current U.S. public schools in terms of region, ethnicity, and SES
3. Did not use a consistent or valid process to determine goals and cut‐points

The UO DIBELS Data System is committed to providing teachers with the tools they need to meet the needs of all students. The UO Recommended Goals are research‐based to support schools in making confident educational decisions that are in the best interest of their students.

Thank you!

For free resources, please visit the DIBELS Data System Research and Training pages: https://dibels.uoregon.edu. You can call us at (888) 497‐4290. We’re here to support you!

References

American Educational Research Association, American Psychological Association, & National Council on Measurement in Education. (1999). Standards for Educational and Psychological Testing. Washington, DC: American Psychological Association.

Cummings, K.D., Kennedy, P.C., Otterstedt, J., Baker, S.K., & Kame’enui, E.J. (2011). DIBELS Data System: 2010‐2011 Percentile Ranks for DIBELS Next Benchmark Assessments (Technical Report 1101). Eugene, OR: University of Oregon.

Dynamic Measurement Group. (2011). DIBELS Next Technical Manual. Eugene, OR: Author.

Joint Committee on Testing Practices. (2004). Code of Fair Testing Practices in Education. Washington, DC: Author. Retrieved from http://www.theaaceonline.com/codefair.pdf

National Center on Response to Intervention. (2011). Standard protocol for evaluating response to intervention tools: Screening reading and math [Screening tool evaluation protocol]. Washington, DC: Author.

Pearson Education, Inc. (2007). Stanford Achievement Test—10th Edition (SAT10): Normative update. Upper Saddle River, NJ: Author.

SLIDE 9

References

University of Oregon, Center on Teaching and Learning. (2012a). 2012‐2013 DIBELS Data System Update Part I: DIBELS Next Composite Score (Technical Brief 1202). Eugene, OR: University of Oregon. Retrieved from https://dibels.uoregon.edu/docs/techreports/DDS2012TechnicalBriefPart1.pdf

University of Oregon, Center on Teaching and Learning. (2012b). 2012‐2013 DIBELS Data System Update Part II: DIBELS Next Benchmark Goals (Technical Brief 1203). Eugene, OR: University of Oregon. Retrieved from https://dibels.uoregon.edu/docs/techreports/DDS2012TechnicalBriefPart2.pdf

University of Oregon, Center on Teaching and Learning. (2012c). DIBELS Next Recommended Benchmark Goals: Technical Supplement (Technical Report 1204). Eugene, OR: University of Oregon. Retrieved from https://dibels.uoregon.edu/docs/techreports/DDS2012TechnicalSupplement.pdf