Designing Improvement Initiatives. Gareth Parry, Amy Reid, Amrita Dasgupta. June 28, 2016. PowerPoint presentation.



SLIDE 1

Designing Improvement Initiatives

June 28, 2016

These presenters have nothing to disclose

Gareth Parry, Amy Reid, Amrita Dasgupta

SLIDE 2

A learning healthcare system is [one that is] designed to generate and apply the best evidence for the collaborative healthcare choices of each patient and provider; to drive the process of discovery as a natural outgrowth of patient care; and to ensure innovation, quality, safety, and value in health care.

SLIDE 3

Talk to your neighbor:

What have you tried to improve this week?

SLIDE 4

An Intervention to Decrease Catheter-Related Bloodstream Infections in the ICU

Peter Pronovost, et al., December 2006

Conclusions: An evidence-based intervention resulted in a large and sustained reduction (up to 66%) in rates of catheter-related bloodstream infection that was maintained throughout the 18-month study period.

SLIDE 5

Why Many Improvement Initiatives Are Found to “Fail”

SLIDE 6

Conclusions (Friedberg et al.): A multipayer medical home pilot, in which participating practices adopted new structural capabilities and received NCQA certification, was associated with limited improvements in quality and was not associated with reductions in utilization of hospital, emergency department, or ambulatory care services or total costs over 3 years. These findings suggest that medical home interventions may need further refinement.

Conclusions (Urbach et al.): Implementation of surgical safety checklists in Ontario, Canada, was not associated with significant reductions in operative mortality or complications.

(4) Friedberg, MW, et al. (2014). Association between participation in a multi-payer medical home intervention and changes in quality, utilization, and costs of care. Journal of the American Medical Association.

(5) Urbach, DR, et al. (2014). Introduction of Surgical Safety Checklists in Ontario, Canada. New England Journal of Medicine.

SLIDE 7

“…described in the 1980s by American program evaluator Peter Rossi as the ‘Iron Law’ of … arguing that as a new model is implemented widely across a broad range of settings, the effect will tend toward zero.”

SLIDE 8

Innovation to Prototyping: Small Number of Settings

Applied in a narrow range of contexts. Improvement in 100% of sites.


Parry, GJ, et al. (2013). Recommendations for Evaluation of Health Care Improvement Initiatives, Academic Pediatrics

SLIDE 9

Initial Testing: Small Number of Settings

Applied in a wider range of contexts. Improvement in 80% of sites.

Parry GJ, et al (2013).

SLIDE 10

More Settings as Range of Contexts Begins to Expand

Applied in a wider range of contexts. Improvement in 70% of sites.

Parry GJ, et al (2013).

SLIDE 11

Wide Range of Contexts

Applied in a wide range of contexts. Improvement in 50% of sites.

Parry GJ, et al (2013).

SLIDE 12

Reduction in Effectiveness from Applying Same Fixed-Protocol Program in Different Contexts

Innovation sample

Parry GJ, et al (2013).

SLIDE 13

Innovation sample

Evaluation sample

Immediate wide-scale implementation


Reduction in Effectiveness from Applying Same Fixed-Protocol Program in Different Contexts

Parry GJ, et al (2013).

SLIDE 14

Where Can the Protocol Be Amended to Work?


Identify contexts in which it can be amended to work as we move from Innovation to Prototype to Test and Spread

Innovation sample

Parry GJ, et al (2013).

SLIDE 15

Core Concepts & Detailed Tasks

Action Theory

Core Concepts: Use a reliable method to identify deteriorating patients in real time. When a patient is deteriorating, provide the most appropriate assessment and care as soon as possible.

Detailed Tasks and Local Adaptations: escalation thresholds (e.g., MEWS >= 5 or MEWS >= 4) and responding teams (e.g., 2 nurses and 1 physician; 1 nurse and 1 physician; 1 physician) vary by site.
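As an illustration of the split between core concept and detailed task, the escalation idea can be sketched as a site-configurable rule. The thresholds and team compositions below are hypothetical examples, not clinical guidance:

```python
# Core concept: identify deteriorating patients in real time via a MEWS score.
# Detailed task / local adaptation: each site sets its own threshold and
# responding team. All values below are hypothetical.

SITE_PROTOCOLS = {
    "hospital_a": {"threshold": 5, "team": ["nurse", "nurse", "physician"]},
    "hospital_b": {"threshold": 4, "team": ["nurse", "physician"]},
    "hospital_c": {"threshold": 4, "team": ["physician"]},
}

def escalation(site: str, mews_score: int) -> list[str]:
    """Return the responding team if the score meets the site's threshold,
    or an empty list (continue routine monitoring) otherwise."""
    protocol = SITE_PROTOCOLS[site]
    if mews_score >= protocol["threshold"]:
        return protocol["team"]
    return []
```

The core concept (score-based escalation) stays fixed; only the data table changes from site to site, which is one way to hold a shared theory while allowing local adaptation.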

SLIDE 16

Reflection Question

How do you identify the core concepts of a new model?

SLIDE 17

Degree of Belief

[Figure: degree of belief plotted against accumulated evidence]

SLIDE 18

Degree of Belief in Change Ideas

Innovation (low degree of belief): Generate/discover new models of care with evidence of improvement in a small number of settings.

Testing (moderate degree of belief): Test whether a model works or can be amended to work in specific contexts.

Scale-up and Spread (high degree of belief): Implementation of models shown to apply in a broad range of contexts.

SLIDE 19

The Scientific Basis of Improvement: What Is It?

The Model for Improvement (Langley et al., 1997): What are we trying to accomplish? How will we know that a change is an improvement? What change can we make that will result in improvement? Plan, Do, Study, Act.

System of Profound Knowledge (Deming, 1900-1993): Appreciation of a System; Understanding Variation; Psychology; Theory of Knowledge.

History of Science (pre-1950s): The Scientific Method; Epistemology (C.I. Lewis, Plato, Karl Popper, Foucault, etc.).

SLIDE 20

Dixon-Woods, M, et al. (2011).

1) Generating the pressure (will) for ICUs to take part
2) A networked community
3) Re-framing BSIs as a social problem
4) Approaches that shaped a culture of commitment
5) Use of data as a disciplinary force
6) Hard edges

SLIDE 21

From an Improvement Perspective:

Initial Concepts: Concepts rather than fixed protocols are a good starting point for people to test and learn whether improvement interventions can be amended to their setting.

Social Change: Improvement requires social change, and people are more likely to act if they believe. Work with, rather than doing to.

Context Matters: Interventions need to be amended to local settings (contexts).

Learning: Empower those at the point of care to test, predict, fail forward, and learn what it takes to bring about improvement.

SLIDE 22

What are we learning?

The Kirkpatrick Evaluation of Learning Framework has four levels:

1. What was the participants’ experience? Did the participants have an excellent experience working on the improvement project?
2. What did the participants learn? Did they learn improvement methods and begin testing?
3. Did they modify their behavior? Did they work differently and see change in their process measures?
4. Did the organization improve its performance? Did they improve their outcomes?

SLIDE 23

[Figure: Activities of the improvement leaders/agents → Participant experience (Level 1) → Learning (Level 2) → Process/behavior changes (Level 3) → Organizational, patient-level outcomes (Level 4)]

Content Theory:

What changes will teams make that will result in improvement? Explains how we predict that the change concepts and improvement drivers applied in the project will lead to improved outcomes.

Execution Theory:

What will the improvement initiative do that will lead teams to adopt the process changes?

Explains what improvement leaders or agents will do that will lead front-line teams to adopt the changes described in the content theory.

Parry et al. Recommendations for Evaluation of Health Care Improvement Initiatives, 2013, Acad Peds.

SLIDE 24

Five Core Components


1) Goals: Aim Statement
2) Content Theory: Driver Diagram or Change Package
3) Execution Theory: Logic Model
4) Data, Measurement & Learning: Measurement Plan
5) Dissemination: Dissemination Plan

These components maximize the chances that the results and learning derived from the evaluation of an improvement initiative can be clearly communicated.

SLIDE 25

Five Core Components

Amrita Dasgupta

SLIDE 26

1) Goals: Aim Statement
2) Content Theory: Driver Diagram or Change Package
3) Execution Theory: Logic Model
4) Data, Measurement & Learning: Measurement Plan
5) Dissemination: Dissemination Plan

SLIDE 27

Goals: Aim Statement

SLIDE 28

Hope is not a plan. Some is not a number. Soon is not a time.

- Don Berwick, MD
SLIDE 29

Goals & Aims at Multiple Levels

Aspirational Aim: Stretch goal used mainly to inspire those engaged in the improvement work. Achievable Aim: Measurable target believed to be achievable during a project’s timeframe, captured in aim statements.


SLIDE 30

Aim Statement

How much, by when, for whom? An aim statement describes what we expect to achieve in the timeframe of the project, taking the form of “how much, by when, for whom”. The system or bounds of the project are also defined.

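The "how much, by when, for whom" structure above can be sketched as a small record a team might use to check that an aim statement is complete; the field names and the example values are illustrative, restating the deck's Parto Adequado aim:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class AimStatement:
    how_much: str   # measurable target, e.g. "at least 40% natural births"
    by_when: date   # timeframe of the project
    for_whom: str   # system or bounds of the project

    def is_complete(self) -> bool:
        # An aim statement must answer all three questions.
        return bool(self.how_much and self.by_when and self.for_whom)

# Example restated from the deck's Parto Adequado aim:
aim = AimStatement(
    how_much="rate of at least 40% natural births",
    by_when=date(2016, 11, 30),
    for_whom="pilot group of 21 private and 4 public hospitals in Brazil",
)
```

This is only a checklist device: it verifies the aim's form, not whether the target is attainable or informed.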

SLIDE 31

Building an Aim

Pre-Work

• Protect time to develop an attainable and informed aim
• Review what has been achieved in the past in similar work and settings
• Consider the voices needed to set the aim and build buy-in

Creating the Aim

Understand the current state in your system, answer a need in the community

Ongoing

Check progress as you go and refocus aim as needed


SLIDE 32

Progress Scale


1) Program Defined May 2015

Teams formed; target population identified; aims determined; Driver Diagram & Change Package defined; measurement strategy defined; project charter agreed; Extranet created; teams engaged in data collection (at least 90% of teams reporting on natural birth rate indicator and 40% of teams reporting on at least 3 indicators).

2) Activity but No Changes in Practice July 2015

Teams are actively engaged in data collection (100% of teams reporting on natural birth rate indicator and at least 70% of teams reporting on at least 3 indicators). Site visits made by HIAE to all hospitals. 90% of teams have run at least 1 PDSA cycle and completed 2 monthly reports.

3) Modest Improvement Oct 2015

80% of teams show evidence of moderate improvement (wherever teams are starting, they are 20% of the way to the aim of at least 40% natural births or have sustained a rate of at least 40% natural births). 80% of teams have run at least 2 PDSA cycles and completed 4 monthly reports. 60% of teams have tested at least 1 change within each primary driver. 40% of teams are demonstrating moderate improvement in reduction of NICU per capita costs indicator and 30% of teams are demonstrating moderate improvement in reduction of adverse events indicator.

4) Significant Progress March 2016

80% of teams show evidence of significant improvement (wherever teams are starting, they are 70% of the way to the aim of at least 40% natural births or have sustained a rate of at least 40% natural births for at least 2 months). At least 70% of teams show evidence of moderate/significant improvement in reduction of NICU per capita costs indicator and adverse events indicator.

5) Outstanding Success Sep 2016

90% of teams show evidence of outstanding improvement (wherever teams are starting, they have achieved a 40% natural births rate or have sustained a rate of at least 40% natural births for at least 4 months).
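The "percent of the way to the aim" criterion used throughout this scale can be made explicit as a small calculation; the baseline and current values below are hypothetical:

```python
def progress_to_aim(baseline: float, current: float, goal: float) -> float:
    """Fraction of the way from baseline to goal (0 = no progress, 1 = aim met)."""
    if goal == baseline:
        raise ValueError("goal must differ from baseline")
    return (current - baseline) / (goal - baseline)

# Hypothetical team: 20% natural births at baseline, aim of at least 40%.
progress_to_aim(0.20, 0.24, 0.40)  # about 0.2: the "modest improvement" bar
progress_to_aim(0.20, 0.34, 0.40)  # about 0.7: the "significant progress" bar
```

Expressing progress as a fraction of the distance to the aim is what lets the scale apply "wherever teams are starting".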

SLIDE 33

Example

By November 2016, the “Parto Adequado” Collaborative aims to achieve a rate of at least 40% natural births within a pilot group of 21 private hospitals and 4 public hospitals in Brazil whose C-section delivery rates are 75% or higher.


SLIDE 34

Reflection Question

Think about an improvement project you’re involved with: What might be some research implications of designing an improvement project using an aspirational aim vs. an achievable aim?


SLIDE 35

Content Theory

SLIDE 36

Content Theory (The What)

What changes will be made that result in improved outcomes?

Content theory describes the processes or behaviors that, if adopted, we predict will improve patient outcomes.

A driver diagram is a visualization of this shared theory, depicting areas in the system that improvement teams can modify to drive improvement.


SLIDE 37

Content Theory:

What changes will teams make that will result in improved outcomes?

Parry et al. Recommendations for Evaluation of Health Care Improvement Initiatives, 2013, Acad Peds.

Explains how changes in processes will improve organizational performance or patient outcomes.

[Figure: Activities of the improvement leaders/agents → Participant experience (Level 1) → Learning (Level 2) → Process/behavior changes (Level 3) → Organizational, patient-level outcomes (Level 4)]

SLIDE 38

SLIDE 39

Parto Adequado Driver Diagram

SLIDE 40

Reflection Question

Talk to your neighbor: For improvement projects, we start with an initial content theory. How would you go about updating your theory over time?

SLIDE 41

Execution Theory

SLIDE 42

Execution Theory (The How)

What will the improvement initiative do that will lead teams to adopt process changes? Execution theory describes the rationale for how the experience provided by the improvement initiative, the improvement methods taught and other activities delivered, and the learning applied lead to improvement in the process or outcome measures.

SLIDE 43

Execution Theory:

What will the improvement initiative do that will lead teams to adopt the process changes?

Parry et al. Recommendations for Evaluation of Health Care Improvement Initiatives, 2013, Acad Peds.

Explains how a program’s design will enable improvement teams to achieve desired changes.

[Figure: Activities of the improvement leaders/agents → Participant experience (Level 1) → Learning (Level 2) → Process/behavior changes (Level 3) → Organizational, patient-level outcomes (Level 4)]

SLIDE 44

Why describe your execution theory?

• Clarifies theory and strategy
• Gets everyone on the same page
• Increases intentionality and purpose, sets priorities
• Identifies measures that matter to us
• Helps identify standard work
• Allows for comparison across programs
• Many funders require this


SLIDE 45

Using Logic Models

Source: WK Kellogg Foundation, Logic Model Development Guide

[Logic model figure, annotated with Execution Theory and Content Theory]

SLIDE 46

Parto Adequado Collaborative – PPA May 2015 – November 2016

Activities

What are you doing? (e.g. training, coaching, expert meetings)
• Teams: attend LS and Webex calls; upload data and monthly reports; plan, test, implement and report changes; report results, successes and barriers
• Steering committee meetings to plan, execute and assess progress
• Develop driver diagram, change package, measurement strategy, logic model, dissemination plan
• Site visits

Inputs

What resources will be used to support the project?
• IHI: staff (senior leader, director, IA); Extranet, Webex; change package, mapping processes, measures, DD; learning sessions; clinical experts
• HIAE: senior sponsor, clinical director; manage logistics; interact with local media; finance the Collaborative; electronic questionnaire; clinical training for the health professionals; site visits for the 28 hospitals; test the innovations and changes in advance
• ANS: senior sponsor; ANS website; experts in the field; support the project as the regulatory agency for the private sector

Mid-Term Outcomes

• Increase in providers’ engagement of patients & families
• Teams using QI methods to improve processes of maternal care
• Raised awareness in society about the risks of an unnecessary C-section
• Teams engaged in collecting, analyzing & interpreting data to support QI

Did behavior and/or process measures change?

Short-Term Outcomes

What changes in attitude, knowledge, or skill will be needed to move forward?
• Identify system barriers from the patient perspective
• Improved teamwork and communication among teams and other hospitals
• Ability to identify & segment the target patient population
• Build skill in using the MFI and measurement
• Providers apply best practice in maternal care

Long-Term Outcomes

Did the outcome improve?

• Improved experience of care: natural birth as a positive and desirable experience
• Hospital teams comfortable using the MFI in all areas
• Hospitals actively working on safety and quality in maternal care, reducing morbidity for mothers and babies
• Increase the percentage of natural births, in a safe way, near to what the WHO recommends

Teams agreed with the change package and set priorities (testing and implementation). Culture of excitement about improvement among participants. ANS: select and invite hospitals; discuss the regulatory environment with stakeholders; convene stakeholders to discuss the DD, change package and measures.

Outputs

What is your reach and what are the products of the activities? (e.g. 20 leaders trained on X topic)
• DD, change package, and measurement strategy document agreed by stakeholders
• Outcome and process measures from all teams
• 28 hospitals trained in the MFI
• 5 LSs and 17 Webex calls
• 28 hospitals visited by HIAE to instruct on adequate infrastructure to assist natural birth
• Newsletters and reports

Contextual and External Factors: Brazil has the highest C-section rate on the planet. In the last decade the C-section rate increased despite the efforts of ANS, the regulatory agency for the private sector: published rules and recommendations had no effect. Before 2012, no demonstration of reducing C-section rates in the private sector was acknowledged. First pilot, 2012: Unimed Jaboticabal went from 0% to 40% natural births in 9 months using the MFI, and 3 more cities saw the same results. A public prosecutor sued ANS; ANS asked for IHI's help. Obstetricians don't see the high C-section rate as a problem.

SLIDE 47

Reflection Question

Talk to your neighbor: What activities do you need to get people to implement process changes?


SLIDE 48

Five Core Components


1) Goals: Aim Statement
2) Content Theory: Driver Diagram or Change Package
3) Execution Theory: Logic Model
4) Data, Measurement & Learning: Measurement Plan
5) Dissemination: Dissemination Plan

SLIDE 49

Data, Measurement & Learning

You cannot fatten a pig by weighing it.

SLIDE 50

Data, Measurement & Learning

How will we know that a change is an improvement?

SLIDE 51
1. Do we have measures for all goals?
2. When and how will various data be collected, including quantitative and qualitative data?
3. How often will data be analyzed? By what methods?
4. How will we use data to inform course corrections?
5. What do we want to learn over time?

Data, Measurement & Learning

SLIDE 52
1. Do we have measures for all goals?
2. When and how will various data be collected, including quantitative and qualitative data?
3. How often will data be analyzed? By what methods?
4. How will we use data to inform course corrections?
5. What do we want to learn over time?

Data, Measurement & Learning

SLIDE 53

Kirkpatrick Framework

ADD CITATION FROM KIRK PAPER


1. Experience: What was the participants’ experience? Did the participants have an excellent experience working on the improvement project?
2. Learning: What did participants learn? Did they learn improvement methods and begin testing?
3. Process/Behavior: Did participants modify their behavior? Did they work differently and see change in their process measures?
4. Outcomes: Did the organization improve its performance (via outcome measures)?
SLIDE 54

Kirkpatrick Framework

ADD CITATION FROM KIRK PAPER


1. Experience
• Survey question: Overall, I had an excellent experience at this learning session/webinar/event.
• Open-ended: Tell us more about your experience at this learning session/webinar/event.
2. Learning
• As a result of this learning session, I feel more confident in my ability to _________
• Observe application of skills
3. Process/Behavior
• Run/SPC charts of process measures
• Qualitative interviews asking what enables/hinders process improvement
4. Outcomes
• Run/SPC charts of outcome measures
• Qualitative interviews asking about other factors impacting their ability to see desired outcomes

SLIDE 55
1. Do we have measures for all goals?
2. When and how will various data be collected, including quantitative and qualitative data?
3. How often will data be analyzed? By what methods?
4. How will we use data to inform course corrections?
5. What do we want to learn over time?

Data, Measurement & Learning

SLIDE 56
• Are data already collected on a timeframe around which you can be flexible? Can you use existing tools to collect data, or do you have to create your own?
• Depending on the measure, at least monthly collection of quantitative data.
• Quarterly collection of qualitative data; collection by phase of the project; collection where you’re already coming together.

Data, Measurement & Learning

SLIDE 57
1. Do we have measures for all goals?
2. When and how will various data be collected, including quantitative and qualitative data?
3. How often will data be analyzed? By what methods?
4. How will we use data to inform course corrections?
5. What do we want to learn over time?

Data, Measurement & Learning

SLIDE 58

How often will data be analyzed? By what methods?

Monthly analysis using run chart or SPC chart rules.

Data, Measurement & Learning
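As one concrete example of such a rule, a run-chart "shift" is commonly defined as six or more consecutive points on the same side of the median, with points exactly on the median skipped. A minimal sketch, assuming monthly values arrive as a list of numbers:

```python
def median(values):
    """Median of a list of numbers (no external libraries needed)."""
    s = sorted(values)
    n = len(s)
    mid = n // 2
    return s[mid] if n % 2 else (s[mid - 1] + s[mid]) / 2

def has_shift(points, run_length=6):
    """Detect a run-chart shift: run_length consecutive points all above or
    all below the median. Points exactly on the median are skipped and do
    not break the run."""
    m = median(points)
    run_sign = 0  # +1 when the current run is above the median, -1 below
    run = 0
    for p in points:
        if p == m:
            continue
        sign = 1 if p > m else -1
        run = run + 1 if sign == run_sign else 1
        run_sign = sign
        if run >= run_length:
            return True
    return False
```

A team reviewing monthly natural-birth rates could apply this rule each month; a detected shift is a signal of non-random change worth studying, not proof of improvement.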

SLIDE 59

Parto Adequado Outcome Measure

[Run chart: Percent of Natural Births in 25 Hospitals (Pilot); x-axis: date; y-axis: percent of natural births, 0% to 40%; annotations: after start of Collaborative, goal line]

Increase in the median rate of natural births from approximately 22% in early 2015 to approximately 33% by March 2016, within the pilot population in the 25 pilot hospitals.

SLIDE 60
  • NICU Cost measure
  • Adverse Events measure
  • Maternal Satisfaction measure

Learning Discussion Topics:
• Measures for all goals?
• Role of leadership?
• Contextual factors like the Zika virus
• Changing the culture of convenience


Parto Adequado

SLIDE 61
1. Do we have measures for all goals?
2. When and how will various data be collected, including quantitative and qualitative data?
3. How often will data be analyzed? By what methods?
4. How will we use data to inform course corrections?
5. What do we want to learn over time?

Data, Measurement & Learning

SLIDE 62

Reflection Question

What areas of improvement do you have in your data, measurement and/or learning in your QI initiative?


SLIDE 63

Five Core Components


1) Goals: Aim Statement
2) Content Theory: Driver Diagram or Change Package
3) Execution Theory: Logic Model
4) Data, Measurement & Learning: Measurement Plan
5) Dissemination: Dissemination Plan

SLIDE 64

Dissemination Plan

“It depends. If I am to speak ten minutes, I need a week for preparation; if fifteen minutes, three days; if half an hour, two days; if an hour, I am ready now.”

SLIDE 65

Dissemination Plan Design

  • The message
  • Target audience(s)
  • Outputs(s) for reaching audience
  • Initial work plan
  • Resourcing
SLIDE 66

Dissemination Plan


SLIDE 67

Dissemination Plan Design

  • Supports scale-up & spread
  • Adds knowledge to the field
  • Helps team better understand

their learning

SLIDE 68

SQUIRE: Standards for Quality Improvement Reporting Excellence


Ogrinc, G, et al. "SQUIRE 2.0 (Standards for QUality Improvement Reporting Excellence): revised publication guidelines from a detailed consensus process." BMJ Qual Saf. doi:10.1136/bmjqs-2015-004411

http://www.squire-statement.org/

SLIDE 69

Reflection Question

How do you plan to communicate results and learning from your improvement work?

SLIDE 70

A Learning System to Reflect on Progress

SLIDE 71

Project Progress Scale: On Track

1) Program Defined May 2015

Teams formed; target population identified; aims determined; Driver Diagram & Change Package defined; measurement strategy defined; project charter agreed; Extranet created; teams engaged in data collection (at least 90% of teams reporting on natural birth rate indicator and 40% of teams reporting on at least 3 indicators).

2) Activity but No Changes in Practice July 2015

Teams are actively engaged in data collection (100% of teams reporting on natural birth rate indicator and at least 70% of teams reporting on at least 3 indicators). Site visits made by HIAE to all hospitals. 90% of teams have run at least 1 PDSA cycle and completed 2 monthly reports.

3) Modest Improvement Oct 2015

80% of teams show evidence of moderate improvement (wherever teams are starting, they are 20% of the way to the aim of at least 40% natural births or have sustained a rate of at least 40% natural births). 80% of teams have run at least 2 PDSA cycles and completed 4 monthly reports. 60% of teams have tested at least 1 change within each primary driver. 40% of teams are demonstrating moderate improvement in reduction of the NICU per capita costs indicator and 30% of teams are demonstrating moderate improvement in reduction of the adverse events indicator.

4) Significant Progress March 2016

80% of teams show evidence of significant improvement (wherever teams are starting, they are 70% of the way to the aim of at least 40% natural births or have sustained a rate of at least 40% natural births for at least 2 months). At least 70% of teams show evidence of moderate/significant improvement in reduction of NICU per capita costs indicator and adverse events indicator.

5) Outstanding Success Sep 2016

90% of teams show evidence of outstanding improvement (wherever teams are starting, they have achieved a 40% natural births rate or have sustained a rate of at least 40% natural births for at least 4 months).
SLIDE 72

Europe (Patient Safety): STOP HAI Collaborative, Portugal

Aim: In 12 prototype hospitals, the aim of STOP HAI is to halve the rate of hospital-acquired infections in prototype units by July 2018.

SLIDE 73

Rapid-cycle Evaluation System

Design: Clarify core components.
Improvement: Study and amend core components.
Close-out: Finalize new program theory.

Timeline: improvement activities begin; Review A; Review B; Review C; improvement activities end.

INPUTS: Program theory; goals, measures; evaluation plan; dissemination plan.

OUTPUTS: Revised program theory; dissemination.

SLIDE 74

Milestone Calls at IHI

Who: Project Director, Project Manager, Project Coordinator, Faculty, Senior Sponsor, Improvement Advisor, Evaluation Associate, Regional Lead, Focus Area Lead
What: 90-minute call
Where: In-person and virtual attendees in the IHI office
When: Quarterly and aligned with project milestones
Why: To pause and reflect on progress, data, and learning to make amendments to theory and support ongoing redesign. To zoom out.

SLIDE 75

Reflection

What information will help you to adapt an improvement initiative as it progresses?