SSIP Evaluation Workshop 2.0: Taking the Online Series to the Next Level (PowerPoint PPT Presentation)



SLIDE 1

SSIP Evaluation Workshop 2.0:
Taking the Online Series to the Next Level

Improving Data, Improving Outcomes Pre-Conference
August 14, 2018

SLIDE 2

Welcome!

  • TA Centers
    – DaSy
    – ECTA
    – IDC
    – NCSI
  • State Participants

SLIDE 3

Intended Outcomes of Workshop

Participants will
  • Increase understanding of how to conduct a high-quality SSIP evaluation
  • Identify resources and next steps for improving SSIP evaluation
  • Identify clear strategies for improving their evaluation plan that will enable them to effectively evaluate SSIP improvement efforts

SLIDE 4

Agenda

  • Presentation:
    – SSIP Evaluation—Pulling it all together for improvement
    – Data analysis strategies and plan
  • State work time
  • Concurrent presentations:
    – Evaluating infrastructure
    – Evaluating practice change and fidelity
  • State work time
  • Wrap up

SLIDE 5

How We Will Work Together

  • Today is a conversation
  • Ask questions
  • Tell us what you want to work on
  • Tell us how we can support you going forward

SLIDE 6

SSIP Evaluation
Data Analysis for SSIP Improvement

SLIDE 7

Intended Outcomes of this Session

Participants will
  • Increase understanding of how to use data from multiple sources to examine SSIP progress and outcomes
  • Increase understanding of strategies for analysis and use
  • Identify strategies for developing or improving their SSIP analysis plan

SLIDE 8

Infrastructure and Practice Implementation to Improve Results

[Diagram: Implementation of Effective Practices leads to good outcomes for children with disabilities and their families; practice quality sustained over time; increase quantity (e.g., scaling up, more practices); increase quality]

SLIDE 9

Evaluation Questions

[Diagram: Implementation of Effective Practices leads to good outcomes for children with disabilities and their families, annotated with the evaluation questions below]

  • Did SSIP activities happen as intended? Did they result in desired infrastructure improvements?
  • Did activities to support local implementation of EBPs happen? Did they result in desired improvements in practitioners' practices?
  • Were intended outcomes for children/families achieved?

SLIDE 10

Using Data to Improve

[Diagram: Implementation of Effective Practices leads to good outcomes for children with disabilities and their families; data are used to improve at the level of individual practitioners and of programs/local infrastructure]

SLIDE 11

SSIP Data Analysis—Purpose

  • Reporting to OSEP and some state stakeholders
    – Summarize data at a high level
    – Overall themes, findings
  • Improve SSIP activities and outcomes
    – Deeper data dive
    – Details needed to inform decisionmaking at different system levels

SLIDE 12

Using Data for Decisionmaking at Different System Levels

  • Improvement at different system levels
    – State
    – Local programs & schools/districts
    – Coaches, practitioners
  • What information do decisionmakers at different system levels need?
  • What is the appropriate unit of analysis?

"The unit of analysis is the major entity that is being analyzed in a study. It is the 'what' or 'who' that is being studied" (Wikipedia, 8-6-18)

SLIDE 13

Unit of Analysis

  • State
  • Region
  • Program
  • Coach
  • Practitioner
  • Child

SLIDE 14

Using Multiple Methods for a Comprehensive Evaluation Approach

  • No single method or data source can tell you everything
  • Examine SSIP implementation from different perspectives (e.g., administrators, practitioners, families)
  • Mix of quantitative and qualitative data

SLIDE 15

Example Evaluation Question

Evaluation Question: Are practitioners implementing the evidence-based practices with fidelity?
  – Are practitioners improving implementation of the practices?
  – Which regions/programs are reaching high rates of practitioner fidelity? Which ones have low rates?
  – Are there specific practices that practitioners are struggling with?
  – What factors are helping practitioners reach fidelity? What challenges are they facing?

SLIDE 16

Example: Data Sources for Evaluating Practice Implementation

Are the evidence-based practices being implemented with fidelity?
  • Video observation
  • Survey—Practitioner Self Report
  • Interviews of Program Administrators
  • Focus Groups of Practitioners

SLIDE 17

Further Adventures in Data

  • Leverage data for your own purposes
    – Changes over time?
    – Differences in cohorts?
    – Differences between low and high achievers (districts, schools, practitioners)?
    – Differences between those who do and do not participate?
  • To answer your questions, you may need to aggregate or disaggregate in different ways

SLIDE 18

Data Aggregation

  • To address evaluation questions at different system levels and for different purposes
  • Different ways to aggregate (summarize) data

SLIDE 19

Data Aggregation Examples

  • Percentage of practitioners reaching fidelity (e.g., statewide, in particular regions or programs)
  • Percentage of practitioners with an improved score (over 2 points in time)
  • Average change score (over 2 points in time)
  • Percentage of programs meeting a performance indicator for practitioner fidelity
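As a concrete sketch, the aggregation metrics above can be computed in a few lines. The fidelity threshold (80) and all practitioner scores below are invented for illustration; a state would substitute its own fidelity tool and criterion.

```python
# Hypothetical practitioner fidelity scores at two points in time.
# The threshold of 80 and all scores are invented for illustration.
FIDELITY_THRESHOLD = 80

scores = {
    # practitioner: (time 1 score, time 2 score)
    "P1": (70, 85),
    "P2": (75, 78),
    "P3": (82, 88),
    "P4": (60, 72),
}

n = len(scores)

# Percentage of practitioners reaching fidelity at time 2
pct_reaching_fidelity = 100 * sum(t2 >= FIDELITY_THRESHOLD for _, t2 in scores.values()) / n

# Percentage of practitioners with an improved score over the 2 points in time
pct_improved = 100 * sum(t2 > t1 for t1, t2 in scores.values()) / n

# Average change score over the 2 points in time
avg_change = sum(t2 - t1 for t1, t2 in scores.values()) / n

print(pct_reaching_fidelity)  # 50.0 (P1 and P3 are at or above 80)
print(pct_improved)           # 100.0 (every practitioner's score rose)
print(avg_change)             # 9.0 ((15 + 3 + 6 + 12) / 4)
```

Note that the same raw scores feed all three summaries; which one you report depends on the audience and evaluation question.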

SLIDE 20

Data Aggregation Calculation Example

Example data summary: Percentage of programs meeting the performance indicator for practitioner fidelity. For instance: 60% of programs had at least 75% of practitioners meeting fidelity on implementation of the Pyramid Model.

Calculation:
  1. Determine whether each practitioner met the fidelity threshold.
  2. Calculate the percentage of practitioners meeting the fidelity threshold for each program: # of practitioners from the program that met fidelity / total # of practitioners from the program with a fidelity score.
  3. Calculate the percentage of programs where the percentage of practitioners reaching fidelity is at least 75%: # of programs with at least 75% of practitioners reaching fidelity / total # of programs.
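The three steps can be sketched directly in code. The program names, scores, and the 80-point practitioner threshold are all hypothetical; only the 75% program indicator comes from the example itself.

```python
# Hypothetical per-practitioner fidelity scores, grouped by program.
# The 80-point practitioner threshold and all scores are invented;
# the 75% program indicator matches the worked example.
FIDELITY_THRESHOLD = 80
PROGRAM_INDICATOR = 0.75

programs = {
    "Program A": [85, 82, 90, 70],  # 3 of 4 meet fidelity -> 75%
    "Program B": [60, 88, 75, 79],  # 1 of 4 meet fidelity -> 25%
    "Program C": [81, 84, 80],      # 3 of 3 meet fidelity -> 100%
}

def program_meets_indicator(fidelity_scores):
    # Steps 1-2: flag each practitioner against the threshold,
    # then take the program's share of practitioners meeting it
    share = sum(s >= FIDELITY_THRESHOLD for s in fidelity_scores) / len(fidelity_scores)
    return share >= PROGRAM_INDICATOR

# Step 3: percentage of programs meeting the indicator
n_meeting = sum(program_meets_indicator(s) for s in programs.values())
pct_programs = 100 * n_meeting / len(programs)
print(f"{pct_programs:.0f}% of programs met the performance indicator")  # 67% (A and C)
```

Step 2 uses only practitioners with a fidelity score as the denominator, mirroring the calculation described above.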

SLIDE 21

Disaggregating Data

  • Digging deeper into data
  • To examine variation between subgroups and topics

SLIDE 22

Subgroup Example

School     District  EBP                                       Fidelity
Adams      A         Pyramid                                   85
Anderson   A         DEC Recommended Practices                 60
Bond       B         Family-Guided Routine Based Intervention  70
Baker      B         Pyramid                                   80
Carver     C         Pyramid                                   75
Coolidge   C         DEC Recommended Practices                 70
Desmond    D         Family-Guided Routine Based Intervention  79
Drake      D         DEC Recommended Practices                 65
Evans      E         Pyramid                                   83
Ellington  E         Family-Guided Routine Based Intervention  77

SLIDE 23

Subgroup Example: District Fidelity by Threshold

[Bar chart, "Fidelity by District": number of schools with fidelity 80% or above (4) vs. below 80% (6)]

SLIDE 24

Subgroup Example: Fidelity by District

[Bar chart, "Fidelity by District": A 73%, B 75%, C 73%, D 72%, E 80%]

SLIDE 25

Subgroup Example: Fidelity by EBP

[Bar chart, "Fidelity by EBP type": Pyramid 81%, DEC RP 65%, RBI 75%]
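The district and EBP disaggregations can be reproduced from the slide 22 school-level table with a short script (plain Python, no libraries). EBP names are abbreviated as in the charts, and averages are rounded half up so they line up with the chart labels.

```python
# Disaggregating the school-level fidelity table by district and by EBP.
# Rows copied from the subgroup example table; EBP names abbreviated
# (DEC RP = DEC Recommended Practices, RBI = Family-Guided Routine
# Based Intervention) to match the charts.
from collections import defaultdict

rows = [  # (school, district, ebp, fidelity)
    ("Adams", "A", "Pyramid", 85), ("Anderson", "A", "DEC RP", 60),
    ("Bond", "B", "RBI", 70), ("Baker", "B", "Pyramid", 80),
    ("Carver", "C", "Pyramid", 75), ("Coolidge", "C", "DEC RP", 70),
    ("Desmond", "D", "RBI", 79), ("Drake", "D", "DEC RP", 65),
    ("Evans", "E", "Pyramid", 83), ("Ellington", "E", "RBI", 77),
]

def mean_by(key_index):
    # Group fidelity scores by the chosen column, then average each group.
    groups = defaultdict(list)
    for row in rows:
        groups[row[key_index]].append(row[3])
    # int(x + 0.5) rounds halves up to match the chart labels
    # (Python's built-in round() would round 72.5 down to 72)
    return {k: int(sum(v) / len(v) + 0.5) for k, v in sorted(groups.items())}

print(mean_by(1))  # by district: {'A': 73, 'B': 75, 'C': 73, 'D': 72, 'E': 80}
print(mean_by(2))  # by EBP: {'DEC RP': 65, 'Pyramid': 81, 'RBI': 75}
```

The same grouping function answers both questions; changing the key column is what "disaggregating in different ways" amounts to in practice.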

SLIDE 26

Other ways to disaggregate data?

  • Other comparisons (e.g., different subgroups)?
  • Other ways to dig deeper into the data?

SLIDE 27

Implications of Results for SSIP Work

  • Differences in fidelity by district or program
  • Differences in fidelity by a particular practice/EBP
  • Differences in subgroups, e.g.:
    – Schools
    – Practitioners

SLIDE 28

Developing an Analysis Plan

Develop a written plan:
  • Analysis strategies
  • Timeline
  • Who's responsible
  • End products (e.g., reports, presentations)

SLIDE 29

Analysis Planning Handouts

  • Evaluation Plan Worksheet
  • Data Analysis Worksheet

SLIDE 30

Takeaways

  • Use multiple methods
  • Analysis strategies will depend on purpose
  • Aggregate data for some audiences
  • Disaggregate to dig deeper
  • Develop a written analysis plan

SLIDE 31

Questions? Comments?

SLIDE 32

  • Refining Your Evaluation: Data Pathway—From Source to Use
  • Strengthening SSIP Evaluations with Qualitative Methods (DaSy)
  • Materials from the SSIP Evaluation online workshop series are posted on the DaSy website: Evaluation of Implementation of EBP Workshop Resources and Evaluation of Infrastructure Resources

SLIDE 33

State Work Time – Table Groupings

Salon F:
  • MA, LA
  • CO, UT, AR
  • PA, ID-B
  • HI, ID-C

Salon E:
  • GA
  • IL
  • WY, FL
  • CT

SLIDE 34

Wrap Up

  • Reflections
  • IDIO conference sessions
  • Next steps
  • Session evaluation

SLIDE 35

Reflections

  • What struck you today?
  • What did this get you thinking about?

SLIDE 36

Related Conference Sessions

  • Evaluating practice implementation (Wednesday, 1:30-3:00)
  • Evaluating infrastructure (Wednesday, 8:30-10:00)
  • Evaluating professional development (Tuesday, 3:00-4:30)
  • Data Analysis (Wednesday, 1:30-3:00)

SLIDE 37

Next Steps

  • Take a few moments to reflect on next steps (handout)
  • To request follow-up support or individualized TA:
    – Talk to one of us today
    – Contact your current TA provider

SLIDE 38

The contents of this presentation were developed under grants from the U.S. Department of Education: #H373Z120002, #H326P120002, #H326R140006, and #H373Y130002. However, those contents do not necessarily represent the policy of the U.S. Department of Education, and you should not assume endorsement by the Federal Government. Project Officers: Meredith Miceli, Richelle Davis, Julia Martin Eile, Perry Williams, and Shedeh Hajghassemali.

Thank You!