DATA BASED PROBLEM SOLVING
Don Kincaid, University of South Florida


SLIDE 1

DATA BASED PROBLEM SOLVING

Don Kincaid University of South Florida

SLIDE 2
  • 1. MTSS Defined

MTSS is a term used to describe an evidence-based framework for educating students that includes providing high-quality, effective core instruction and intervention supports matched to student needs, and that uses data-based problem solving to integrate all academic and behavior instruction and interventions.

SLIDE 3
  • 2. Data Utilization – In General
  • Educators should use key questions to guide data use (Feldman & Tung, 2001; Lachat & Smith, 2005; Protheroe, 2001)
  • Structured data use approaches: use data rather than be used by data (Wayman & Stringfield, 2006)

SLIDE 4

SLIDE 5
  • 2. Data Utilization – In General
  • Recognize & plan for common barriers to data use (Coburn & Talbert, 2006; Honig & Venkateswaren, 2012; Kerr et al., 2006; Lachat & Smith, 2005; Little, 2012; Young, 2006)
  • Expand the definition of a “data system” beyond just technology – include data practices/culture! (e.g., Armstrong & Anthes, 2006; Honig & Venkateswaren, 2012; Ingram et al., 2004)

SLIDE 6
  • 3. Foundation of DBPS: Using Data for Evaluation
  • Data Evaluation:
  • A process is in place for making informed decisions
  • The process includes problem identification, problem analysis, intervention implementation, and evaluation
  • 4-step problem-solving process (RtI)
  • Informs your Action Plan steps:
  • continue interventions/practices
  • adapt, revise, modify interventions/practices
  • discontinue interventions/practices
SLIDE 7

Evaluating Your Data

  • Data Evaluation:
  • A process is in place for making informed decisions
  • The process includes problem identification, problem analysis, intervention implementation, and evaluation
  • 4-step problem-solving process (RtI)
  • Informs your Action Plan steps:
  • continue interventions/practices
  • adapt, revise, modify interventions/practices
  • discontinue interventions/practices
SLIDE 8

Evaluating Your Data

  • Evaluation Process Includes:
  • A system to efficiently and effectively collect, record, and graph data
  • Resources and expertise to review and analyze data
  • Monthly review and analysis of discipline and outcome data
  • SWPBS Action Plan updates based on data review and analysis
  • Discussion:
  • Are these steps included in your school’s data evaluation process?
  • If not, in what areas would you like additional support?
SLIDE 9

Discipline Data Sources

  • Five major data sources:
  • Average referrals/day/month
  • Referrals by: problem behavior, location, time of day, and individual student
  • Additional data sources
  • Referrals by motivation or function (get/obtain, escape/avoid)
  • Office-managed vs. classroom-managed referrals
  • ISS/OSS data
  • Discussion:
  • Does your PBS team review and analyze your school’s discipline data at each meeting?
  • Does your team use the data to evaluate the PBS development and implementation process and develop next steps?
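The data sources above are straightforward to compute from raw office discipline referral (ODR) records. As a minimal sketch in Python, assuming each referral is stored as a (date, behavior, location) tuple — the field names and records below are illustrative, not from the presentation:

```python
from collections import Counter
from datetime import date

# Hypothetical ODR records: (date, problem behavior, location)
referrals = [
    (date(2023, 9, 5), "disruption", "classroom"),
    (date(2023, 9, 5), "defiance", "hallway"),
    (date(2023, 9, 12), "disruption", "cafeteria"),
    (date(2023, 10, 3), "fighting", "playground"),
]

# Illustrative school calendar: instructional days per month
school_days = {9: 20, 10: 22}

# Average referrals per school day, by month
by_month = Counter(d.month for d, _, _ in referrals)
avg_per_day = {m: n / school_days[m] for m, n in by_month.items()}

# Referrals broken down by problem behavior and by location
by_behavior = Counter(b for _, b, _ in referrals)
by_location = Counter(loc for _, _, loc in referrals)
```

The same Counter pattern extends to the other breakdowns the slide lists, such as referrals by time of day or by individual student.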

SLIDE 10

Other Data Sources

  • Staff, student and/or parent surveys
  • Staff and student attendance
  • Teacher requests for assistance or school-wide behavioral screening
  • ESE referrals
  • Grades and/or standardized test scores (FCAT)
  • Fidelity measures
  • Benchmarks of Quality, PBS Implementation Checklist, Walkthrough Evaluations
  • SWPBS Action Plan
  • Direct observations
  • Discussion:
  • What are other sources of outcome data?
  • Does your PBS team review other data sources at each meeting and use the data to evaluate progress?

SLIDE 11

Data Evaluation

  • Questions to address at each monthly PBS meeting:
  • Are problem behaviors improving?
  • Are problem behaviors ‘holding steady’?
  • Are problem behaviors ‘getting worse’?
  • The following slides provide ‘next steps’ to help address each of these questions.

SLIDE 12

Problem Behaviors Improving

  • Discipline data shows a decrease in problem behavior
  • At least 80% of students receive 0-1 ODRs
  • Significant decrease in ODRs from previous month/quarter
  • Decrease in OSS/ISS days
  • Review other data sources to confirm progress
  • At least 80% of students contact reward events
  • PBS Implementation Checklist/Benchmarks of Quality
  • Consistency exists across teachers, grade-levels/hallways, etc.
  • School-climate/faculty surveys more positive or supportive
  • ODRs are decreasing equally - disaggregate the data
  • ESE, ethnicity/race, free/reduced lunch, male/female
  • Classroom, grade-level, individual teachers
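The “at least 80% of students receive 0-1 ODRs” criterion and the disaggregation step can both be checked directly from referral records. A sketch, with hypothetical student IDs, subgroup labels, and counts:

```python
from collections import Counter

# Hypothetical data: one entry per referral, plus the full roster
referral_students = ["s1", "s1", "s2", "s3", "s3", "s3"]
roster = ["s1", "s2", "s3", "s4", "s5", "s6", "s7", "s8", "s9", "s10"]
# Hypothetical subgroup labels for disaggregation (e.g., sex, ESE status)
subgroup = {s: ("male" if i % 2 == 0 else "female") for i, s in enumerate(roster)}

odr_counts = Counter(referral_students)

# Criterion: at least 80% of students have 0-1 ODRs
students_0_or_1 = sum(1 for s in roster if odr_counts[s] <= 1)
meets_criterion = students_0_or_1 / len(roster) >= 0.80

# Disaggregate total ODRs by subgroup to check for disproportionality
odrs_by_group = Counter(subgroup[s] for s in referral_students)
```

A large gap between subgroup totals in `odrs_by_group` (relative to each group’s enrollment) is the kind of disproportionality the slide asks teams to look for.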
SLIDE 13

Problem Behaviors ‘Holding Steady’

  • Look for areas of improvement
  • Benchmarks of Quality, PIC, Action Plan implementation
  • Increasing the level of support at Tier 1 may increase intervention effectiveness
  • Are your interventions targeted appropriately?
  • Review referrals by location, time of day, teacher, grade-level, etc.
  • Review expectations and rules
  • Are the expectations well-defined and have they been taught?
  • Review discipline procedures and definitions
  • Are problem behaviors well-defined?
  • Are office-managed vs. teacher-managed behaviors well-defined?
  • Do your interventions target the appropriate function/motivation of the problem behaviors?

SLIDE 14

Problem Behaviors ‘Getting Worse’

  • Use the 4-step problem-solving process:
  • 1. Identify the Problem
  • Be specific; problem behavior(s) should be well-defined
  • 2. Analyze the Problem – Hypothesis development
  • Teaching – Are the expectations being taught as planned?
  • Fidelity – Are the interventions being implemented as designed?
  • Admin decisions & function of behavior – Is problem behavior being reinforced?
  • 3. Design Interventions
  • Do the interventions target the problem behavior(s)? Have the strategies been taught to all staff?
  • 4. Evaluation (RtI) – Is it working?
  • Are the problem behaviors decreasing?

SLIDE 15

Problem Behaviors ‘Getting Worse’

Problem-Solving Process

Step 1: Problem Identification – What’s the problem?
Step 2: Problem Analysis – Why is it occurring?
Step 3: Intervention Design – What are we going to do about it?
Step 4: Response to Intervention – Is it working?

SLIDE 16

The PBIS Triangle: Another View

Step 1: Problem Identification – What is the problem?
Step 2: Problem Analysis – Why is it occurring?
Step 3: Intervention Design – What are we going to do about it?
Step 4: Response to Intervention – Is it working?

SLIDE 17

Florida’s Guiding Questions

Step 1 – Problem ID

  • What do we expect our students to know, understand, and do as a result of instruction?
  • Do our students meet or exceed these expected levels? (How sufficient is the core?)
  • Are there groups for whom core is not sufficient?

Step 2 – Problem Analysis

  • If the core is NOT sufficient for either a “domain” or group of students, what barriers have precluded or could preclude students from reaching expected levels?

Step 3 – Plan Development and Implementation

  • What strategies or interventions will be used?
  • What resources are needed to support implementation of the plan?
  • How will sufficiency and effectiveness of core be monitored over time?
  • How will fidelity be monitored over time?
  • How will “good,” “questionable,” and “poor” responses to intervention be defined?

Step 4 – Plan Evaluation of Effectiveness

  • Have planned improvements to core been effective?
SLIDE 18

Step 1: Problem Identification – Tier 1

  • What do we expect our students to know, understand, and do as a result of instruction?
  • Do our students meet or exceed these expected levels? (How sufficient is the core?)
  • Are there groups for whom core is not sufficient?

SLIDE 19

Expectations for Behavior

  • 80% of students have 1 or fewer ODRs
  • Are the # of ODRs, ISS and OSS per 100 students higher than the national or district average?
  • Are the # of ODRs, ISS and OSS per 100 students decreasing?
  • Is attendance steady?
SLIDE 20

Step 1: Problem Identification – Tier 1

  • What do we expect our students to know, understand, and do as a result of instruction?
  • Do our students meet or exceed these expected levels? (How sufficient is the core?)
  • Are there groups for whom core is not sufficient?

SLIDE 21

Do 80% of students exhibit appropriate behavior?

SLIDE 22

Do 80% of students exhibit appropriate behavior?

SLIDE 23

During the current year, does the school have students with 2 or more ODRs by October 1?

SLIDE 24

Are the # of ODRs, ISS and OSS per 100 students higher than the national or district average?

  • National Average for MS is .05 per 100 students
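Expressing counts per 100 students normalizes across schools of different sizes, which is what makes the comparison to a national or district average meaningful. A sketch of the calculation (the enrollment, counts, and benchmark below are made-up numbers, not the slide’s data):

```python
def rate_per_100(count: int, enrollment: int) -> float:
    """Events (ODRs, ISS days, OSS days) per 100 enrolled students."""
    return 100 * count / enrollment

# Hypothetical middle school: 850 students, 340 ODRs so far this year
odr_rate = rate_per_100(340, 850)  # 40.0 ODRs per 100 students

# Compare against a benchmark (illustrative district figure)
district_avg = 55.0
below_average = odr_rate < district_avg
```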

SLIDE 25

Are the # of ODRs, ISS and OSS per 100 students decreasing?

SLIDE 26

Are the # of ODRs, ISS and OSS per 100 students decreasing?

SLIDE 27

Is attendance steady?

SLIDE 28

Step 1: Problem Identification – Tier 1

  • What do we expect our students to know, understand, and do as a result of instruction?
  • Do our students meet or exceed these expected levels? (How sufficient is the core?)
  • Are there groups for whom core is not sufficient?

SLIDE 29

Are there groups of students for whom the Tier 1 Core is not sufficient?

SLIDE 30

Are there groups of students for whom the Tier 1 Core is not sufficient?

SLIDE 31

Are there groups of students for whom the Tier 1 Core is not sufficient?

SLIDE 32

Problem Identification - Example

SLIDE 33

Step 2: Problem Analysis – Tier 1

  • If the core is NOT sufficient for either a “domain” or group of students, what barriers have precluded or could preclude students from reaching expected levels?
  • Why are some students not successful (initial hypotheses)?

SLIDE 34

Step 2: Problem Analysis – Tier 1

  • 1. Instruction
  • Are best practices in instruction being delivered to those students?
  • Is instruction being delivered in sufficient amounts or as often as necessary?
  • 2. Curriculum
  • Are lesson plans in alignment with the appropriate core standards/expectations?
  • Are the curricular materials being used with fidelity or as designed?
  • Does staff have the knowledge and skills to utilize the curricular materials in alignment with grade-level/school-wide standards or expectations?
  • 3. Environment
  • Do all staff and students know the school-wide behavioral expectations?
  • Are they being used consistently across all settings (e.g., school climate)?
  • Are the school-wide behavioral expectations in alignment with the school/district missions?
  • Are best practices in classroom management being utilized and in alignment with the school-wide behavioral expectations?
  • 4. Learner
  • Are students accessing the available instruction? (e.g., attendance)
  • Are students “actively engaged” in classroom instruction?
  • Do students perceive having a positive relationship with their school/teachers?
SLIDE 35

Problem Analysis - Example

SLIDE 36

Step 3: Plan Development and Implementation – Tier 1

  • What strategies or interventions will be used?
  • What resources are needed to support implementation of the plan?
  • How will sufficiency and effectiveness of core be monitored over time?
  • How will fidelity be monitored over time?
  • How will “good,” “questionable,” and “poor” responses to intervention be defined?

SLIDE 37
Intervention Design - Example
SLIDE 38

Step 4: Plan Evaluation – Tier 1

  • Have planned improvements to core been effective?

SLIDE 39

Are the # of ODRs, ISS and OSS per 100 students higher than the national or district average?

  • National Average for MS is .05 per 100 students

Intervention in August produced immediate and sustained change.

SLIDE 40

Are the # of ODRs, ISS and OSS per 100 students decreasing?

Over 50% reduction in two years.

SLIDE 41

Are the # of ODRs, ISS and OSS per 100 students decreasing?

Implementation produced immediate and sustained change.

SLIDE 42

Are there groups of students for whom the Tier 1 Core is not sufficient?

Do these bar graphs level out, indicating no disproportionality?

SLIDE 43

Fidelity?

Does the school see improvements next year in these areas?

SLIDE 44

Response to Intervention - Example

SLIDE 45

Evaluation: Additional Guiding Questions

  • Is PBS/RtI:B being implemented across campus?
  • Is it being implemented with fidelity?
  • Is there sustainability of implementation?
  • Are there benefits to students over time with PBS/RtI:B implementation?
  • Are there benefits for staff?
  • Do students with greater needs benefit from implementation?

SLIDE 46

Guiding Questions: Tiers 2 and 3

  • Do our students meet or exceed expected levels? (How sufficient is the core?)
  – Is there any disproportionality (race, ethnicity, sex, ESE, grade level, class distribution, etc.) in academic/behavior outcomes?
  – Are no more than approximately 20% of students identified as needing additional supplemental supports (i.e., Tier 2)? If not, does the Tier 1 improvement plan address this?
  – Are no more than approximately 5% of students identified as needing intensive supports (i.e., Tier 3)? If not, does the Tier 1 improvement plan address this?
  • Are there groups for whom Tier 2 and Tier 3 services currently being provided are not sufficient?
  – Are there any students who are represented in multiple groups (e.g., demonstrate needs in both behavior and academic domains)?
  – Has the team considered the function and/or type of the problem?
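The approximate 80/20/5 expectations behind these questions can be checked directly from counts of students identified at each tier. A sketch, with all numbers below purely illustrative:

```python
def tier_percentages(tier2_count: int, tier3_count: int, enrollment: int) -> dict:
    """Percent of enrolled students needing Tier 2 (supplemental) and Tier 3 (intensive) supports."""
    return {
        "tier2_pct": 100 * tier2_count / enrollment,
        "tier3_pct": 100 * tier3_count / enrollment,
    }

# Hypothetical school: 600 students; 90 flagged for Tier 2, 24 for Tier 3
pcts = tier_percentages(90, 24, 600)
tier2_ok = pcts["tier2_pct"] <= 20  # no more than ~20% at Tier 2
tier3_ok = pcts["tier3_pct"] <= 5   # no more than ~5% at Tier 3
```

If either check fails, that points back to the Tier 1 improvement plan, as the questions above note.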

SLIDE 47

Guiding Questions: Tiers 2 and 3

Step 2 – Problem Analysis

  • Since the core and/or supplemental instruction is NOT sufficient for either a group of students or an individual student, what barriers have precluded or could preclude students from reaching expected levels?
  – Are hypotheses focused on alterable factors?
  – Is there a clear understanding of the situations (i.e., antecedents) that result in the outcomes being achieved for the group/student who is not meeting expectations?

SLIDE 48

Guiding Questions: Tiers 2 and 3

Step 3 – Plan Development and Implementation

  • What strategies or interventions will be used?
  – Matching intervention to function
  – Limited number of generic approaches
  • What resources are needed to support implementation of the plan?
  – T2 = quick turnaround, limited teacher training, progress monitoring, etc.; T3 = team facilitation, behavioral expertise, etc.
  • How will sufficiency and effectiveness of Tier 2 supports be monitored over time?
  – Introduction of a progress monitoring tool consistent across all interventions
  – Impact of Tier 2 and 3 interventions on core outcome measures (ISS, OSS, ODRs)
  • How will fidelity be monitored over time?
  – Usefulness of the PIC and BAT for fidelity
  • How will “good,” “questionable,” and “poor” responses to intervention be defined?
  – Goal level and criteria for attainment of the goal developed by the Tier 2/3 team

SLIDE 49

Guiding Questions: Tiers 2 and 3

  • Have planned improvements/supports at Tier 2 been effective?
  – Does the team have a set of guidelines to structure a common approach to analyzing the data (e.g., “decision rules”)?
  – If students’ progress in response to Tier 2 or Tier 3 services demonstrates a “good” response, and there is no increase in Tier 1 performance, what decision(s) will the team make?
  – If students’ progress in response to Tier 2 or 3 services demonstrates “questionable” or “poor” responses, is there adequate fidelity of the intervention services being provided? In either case, what decisions will the team make?

SLIDE 50

Intervention Outcomes

SLIDE 51

Implementation Fidelity

SLIDE 52

Guiding Questions: Tiers 2 and 3

Step 4 – Plan Evaluation of Effectiveness

  • Have planned improvements to Tier 2 been effective?
  – Decreases in “core” behavioral issues (OSS, ISS, ODRs)
    • Per student
    • Per groups of students/interventions
    • Per entire school/grade/classroom
  – Improvement on the progress monitoring tool’s goal

SLIDE 53

Data for T2 and 3

SLIDE 54

SLIDE 55

SLIDE 56

SLIDE 57

Use Your Data

  • Identify areas that are problematic
  • Improve Tier 1 supports
  • Expand and implement PBIS strategies
  • Tier 2/Supplemental
  • Tier 3/Intensive
  • Support the acquisition of additional resources for further school improvement

  • Promote PBIS within the community
  • Identify and celebrate successes
SLIDE 58

Thank You!

  • Questions/Comments?