SLIDE 1

Just Culture

Recognizing and Reporting Errors, Near Misses and Safety Events

Robert McKay, M.D. Department of Anesthesiology KUSM-Wichita

SLIDE 2

Why A Cultural Change?

— “The single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes.” – Dr. Lucian Leape, Professor, Harvard School of Public Health, in testimony before Congress

— “Insanity: doing the same thing over and over again and expecting different results.” – attributed to Albert Einstein

— “Change the people without changing the system and the problems will continue.” – Don Norman, author of “The Design of Everyday Things”

SLIDE 3

Are We Improving?

— Let’s form a few small groups and discuss recent interactions you have observed between the health care system and a friend or family member’s medical care.
— Did everything go perfectly?
— If not, how did that make you feel?
— What could have been different?
— Was anything changed to keep a similar event from happening again?

SLIDE 4

Is Our Culture Changing?

— In your groups, now think of yourselves as workers in healthcare.
— Discuss: do you feel we are providing safer care?
— What are the impediments to safe care?
— Are you being supported by the health care system?
— Are you being supported by your co-workers and managers?
— Are you being supported by your patients?

SLIDE 5

What Are Errors?

— Errors are acts of commission or omission leading to undesirable outcomes or significant potential for such outcomes.
— Errors may be active (readily apparent) or latent (less apparent).
— Latent errors can lead to “normalization of deviance,” wherein the behaviors leading to such errors become “normal” and are stripped of their significance as warnings of impending danger.

SLIDE 6

Is It a “Slip” or a “Mistake”?

— A “slip” is a lapse in concentration or inattentiveness.
— Slips are increased by fatigue, stress, and distractions, including emotional distraction.
— Mistakes are failures during conscious thought, analysis, and planning.
— Methods of addressing mistakes (training, supervision, discipline) are ineffective and often counterproductive in addressing slips.

SLIDE 7

Are Human Errors Inevitable?

— “Only two things in the universe are infinite, the universe and human stupidity, and I’m not sure about the former.” – attributed to Albert Einstein
— Yes, Virginia, humans will continue to make errors. – with apologies to Francis Pharcellus Church (1839–1906)
— Thus, if humans are involved, the system MUST be designed either to prevent errors or to prevent the adverse outcomes associated with errors.
— Errors must be reported and analyzed to improve safety.

SLIDE 8

Sources of Human Error

— Irrationality (rationality = good judgment)
— Negligence, conscious disregard of risks (including risks resulting from an error), and gross misconduct (e.g., falsifying records, intoxication)
— Cognitive biases (Wikipedia lists about 100 types)
  — Heuristics – rules governing judgment or decision making
  — As shortcuts, cognitive biases are used more often in complex, time-pressured (production-pressured) systems such as healthcare
— Motivational biases (wishful thinking) – believing something is true (or false) simply because you want it to be so (e.g., “Barry Sanders will win in 2016”)

SLIDE 9

“If you see it on the Internet, it’s So!”

The 1897 version: If you see it in THE SUN, it’s so.

SLIDE 10

Heuristics

Mental shortcuts that decrease cognitive load:

— Availability heuristic – recalling the most recent information or the first possibility that comes to mind.
— Representativeness heuristic – assuming similar observations have similar causes, e.g., fever in the last 2 patients was from atelectasis, thus it must be from atelectasis this time.
— Affect heuristic – “going with your gut feeling,” e.g., “I can do this safely this time” (estimating the risk as lower than it is) or “I’m afraid of this (very rare) outcome” (overestimating the risk due to fear).

SLIDE 11

Uses of the Affect Heuristic

— Smiling (and better-looking) people are
  — more likely to be treated with leniency
  — seen as more trustworthy, honest, sincere, and admirable
— Negative affect
  — Feeling negative increases the perceived risk of a negative outcome, e.g., terrorism in the U.S.
  — It also increases the perceived frequency of the negative outcome
— Lack of affect can also lower perceived risk
  — Climate change is thought unlikely by those unexposed to significant weather changes
— Affect (feeling) trumps numbers (statistics)
  — This explains why terrorism is scarier than driving, even though you are far more likely to be killed just driving to work

SLIDE 12

Is Your “Gut Feeling” that this Lecture will be Too Long?

— Mine says yes!
— I have confirmation bias – my children tell me I lecture them far too much!
— I may have a negative affect (some of you may be asleep), so I must ask …
  — Am I at fault?
  — Or did you have a poor night’s sleep?
  — Or is this mandatory and you have no interest in the subject?
  — Or did you have a great lunch and have postprandial fatigue?
— All of these might affect my conclusion in a biased manner.

SLIDE 13

Common Cognitive Biases That Lead to Errors

— Status quo bias – “my stable patient will remain so”
— Planning fallacy – the tendency to underestimate task-completion times
  — Time crunches further increase the use of cognitive biases as shortcuts
— Information bias – the tendency to seek information even when it cannot affect action
— Focusing effect – placing too much importance on one aspect of an event

SLIDE 14

Normal Accident Theory (Perrow)

— Highly complex settings (e.g., medical care)
  — No single operator can immediately foresee the consequences of a given action
— plus tight coupling of processes
  — Steps must be completed within a certain time period (such as a crash cesarean section)
— equals a potential for error that is intrinsic to the system
  — i.e., a major accident becomes almost inevitable

SLIDE 15

Normal Accident Theory versus A High Reliability Organization

— Though normal accident theory is likely true, it is also probable that most medical errors are NOT related to the complexity of the system.
— Moreover, some organizations are remarkably adept at avoiding errors – even in complex systems.
— Such high reliability organizations (HROs) operate with nearly failure-free performance records, e.g., at Six Sigma (3.4 errors per 1,000,000 events).
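For reference, the Six Sigma figure quoted above is conventionally stated as defects per million opportunities (DPMO); a minimal formulation of that standard usage (not from the slides):

$$\mathrm{DPMO} = \frac{\text{number of defects}}{\text{number of opportunities}} \times 10^{6}, \qquad \text{Six Sigma} \approx 3.4\ \mathrm{DPMO} \approx 99.99966\%\ \text{defect-free}$$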

SLIDE 16

So What Characterizes a High Reliability Organization?

— Preoccupation with failure
— Commitment to resilience
  — Detecting unexpected threats and containing them before they can cause harm
— Sensitivity to operations
— A culture of safety – workers can draw attention to hazards, failures, and errors without fear of censure from management

SLIDE 17

How Can Medicine Become Highly Reliable?

— Increased Use of (Unbiased) Technological Aids
  — Triggers and flags, forcing functions, decision support, checklists, protocols, CPOE, medication scanners
— Use of Rapid Response Teams (intervention before harm)
— Culture of Safety, with Root Cause Analyses and Reporting of Actual or Potential Safety Breaches, i.e., of Critical Incidents
— Quality Improvement Cycles (e.g., PDSA) to Address Error Chains
— Team Training and Crisis Resource Management (CRM)
— Education, e.g., the Five Rights (right medication, right dose, right time, right route, right patient), EBM protocols, etc. (a minimal sketch of a Five Rights check follows this list)
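To make the Five Rights concrete, here is a minimal sketch of the kind of forcing-function check a medication scanner or CPOE system might perform before allowing administration. All names, fields, and values are hypothetical illustrations, not any real system’s API.

```python
from dataclasses import dataclass

@dataclass
class MedicationOrder:
    """Hypothetical order record; field names are illustrative only."""
    patient_id: str
    drug: str
    dose_mg: float
    route: str       # e.g., "PO", "IV"
    due_time: str    # e.g., "14:00"

def five_rights_check(order: MedicationOrder, scanned_patient_id: str,
                      scanned_drug: str, prepared_dose_mg: float,
                      intended_route: str, admin_time: str) -> list:
    """Return the list of mismatched rights; empty means all five match."""
    problems = []
    if scanned_patient_id != order.patient_id:
        problems.append("wrong patient")
    if scanned_drug != order.drug:
        problems.append("wrong medication")
    if prepared_dose_mg != order.dose_mg:
        problems.append("wrong dose")
    if intended_route != order.route:
        problems.append("wrong route")
    if admin_time != order.due_time:
        problems.append("wrong time")
    return problems

# A forcing function would block administration until this list is empty.
order = MedicationOrder("MRN123", "cefazolin", 1000.0, "IV", "14:00")
print(five_rights_check(order, "MRN123", "cefazolin", 1000.0, "IV", "15:00"))
# -> ['wrong time']
```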

SLIDE 18

So What Is Just Culture?

— Addresses the twin needs for a no-blame culture and appropriate accountability.
— Has an open, transparent environment where human errors are expected to occur but are uniformly reported and used as learning events to improve systems and individual behaviors.
  — A culture of safety is foremost.
  — Example: the FAA reporting system.
— Zero tolerance is given to conscious disregard of clear risks to patients (e.g., taking shortcuts), reckless behavior (e.g., refusing to perform safety steps, not reporting errors), or gross misconduct.
  — A purely blameless culture would allow willfully disruptive, negligent, or harmful behavior to persist and lead to patient harm.

SLIDE 19

Error Reporting

— Anonymous reporting
  — Higher likelihood of errors being reported
  — “Safe” reporting with less fear of reprisal
  — Less concern about the need for legal protection
  — Can be associated with an increased level of false reports, which may be malicious and untrue
  — Error causes may be difficult to investigate, as you can’t seek additional information

— Identifiable-source reporting
  — Errors are less likely to be reported (a just culture with punishment for non-reporting can help)
  — Can verify the accuracy of the report
  — Can usually obtain more details about the error, including investigation into the error chain
  — Less likely to be a false or malicious report

SLIDE 20

Incident Reporting Systems

— A supportive environment that protects the privacy of staff who report
  — In a fully just culture, such protection of privacy would be unnecessary
— Any personnel should be able to report
— Summaries of reported events must be disseminated in a timely fashion
— A structured mechanism must be in place for reviewing reports and developing action plans
— Incident reporting is a passive form of surveillance
  — It may miss many errors and latent safety problems
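A minimal sketch of the data an incident reporting system along these lines might capture, reflecting the requirements above (anyone can report, identity optional, reports feed a structured review). Every field and name is a hypothetical illustration, not a reference to any real system.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import List, Optional

@dataclass
class IncidentReport:
    """Hypothetical report record; fields mirror the slide's requirements."""
    event_type: str                     # "error", "near miss", or "safety event"
    description: str
    occurred_at: datetime
    reporter: Optional[str] = None      # None allows anonymous reporting
    action_plan: Optional[str] = None   # added by the structured review process
    summary_disseminated: bool = False  # summaries must go out in a timely fashion

reports: List[IncidentReport] = []

def submit(report: IncidentReport) -> None:
    """Any personnel can file a report; identity is optional."""
    reports.append(report)

submit(IncidentReport("near miss",
                      "look-alike vials stored in adjacent bins",
                      datetime(2016, 3, 1, 14, 0)))
```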

SLIDE 21

Small Group Discussion

— Please discuss why you might not report an error that you have made. If you chose to report it, how would you do it?
— Please discuss why you might not report an error that you have observed a co-worker make. If you chose to report it, how would you do it?

SLIDE 22

Perceived Barriers to Reporting by Physicians

— No feedback after the report is given
— Forms are difficult to use; lack of time
— The incident seemed “trivial” because no patient harm resulted
  — This overlooks latent errors and leads to a culture of low expectations, with normalization of deviance
— Heavy clinical load = forgot to report
— Not sure who should complete the report
— Don’t want to get anyone in trouble (including self)

SLIDE 23

Trigger Tools

— Trigger tools alert providers to probable adverse events
  — The best triggers alert in real time, i.e., before patient harm can occur
— Failure Mode and Effects Analysis (FMEA)
  — Used to prospectively identify error risk within a process
  — Quantitatively estimates the magnitude of the hazard posed by each step; greater threats are addressed first (see the sketch after this list)
— In an HRO, all workers are attentive to conditions and all workers can trigger alerts
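As an illustration of how FMEA “quantitatively estimates the magnitude of the hazard posed by each step,” here is a minimal sketch using the conventional FMEA risk priority number, RPN = severity × occurrence × detectability, each conventionally scored 1–10. The process steps and scores below are invented for illustration only.

```python
# Minimal FMEA sketch: rank process steps by risk priority number (RPN).
# A higher detectability score conventionally means the failure is HARDER
# to detect. Steps and scores are invented for illustration only.
failure_modes = [
    # (process step, severity, occurrence, detectability)
    ("wrong-dose entry at ordering",  9, 3, 4),
    ("look-alike vial selected",      8, 2, 7),
    ("infusion pump misprogrammed",  10, 2, 5),
    ("allergy history not checked",   9, 1, 3),
]

ranked = sorted(failure_modes,
                key=lambda m: m[1] * m[2] * m[3],  # RPN
                reverse=True)

for step, sev, occ, det in ranked:
    print(f"RPN {sev * occ * det:4d}  {step}")
# The highest-RPN steps (greatest threats) are addressed first.
```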

SLIDE 24

Errors in Complex Systems

— Most complex systems display resiliency
  — A single error seldom leads to patient harm
— Error chain – a series of events that leads to a disastrous outcome (the Swiss cheese model)
  — Breaking the chain at any point may prevent the bad outcome (see the note after this list)
— Root cause analysis can categorize the errors into common links:
  — Failure to follow standard procedures
  — Poor leadership
  — Breakdowns in communication or teamwork
  — Overlooking or ignoring individual fallibility
  — Losing track of objectives
— Team training, fishbone diagrams, etc. can help address the error chain
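The resiliency claim above is often made quantitative with a simple, admittedly idealized, independence assumption: if each of $n$ defensive layers fails with probability $p_i$, harm requires the holes in every layer to line up at once, so

$$P(\text{harm}) = \prod_{i=1}^{n} p_i$$

For example, three independent layers that each fail 10% of the time let an error through only $0.1^3 = 0.001$ of the time, and driving any single $p_i$ toward zero (breaking the chain at one point) prevents the bad outcome.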

SLIDE 25

Reducing Slips

Report Problems with Structures or Processes

— Right structure + right processes = right outcomes
— Structure:
  — Close units when appropriate (closed units have better outcomes and lower costs)
  — Clinical information systems
  — Sufficient patient volume to develop expertise
  — Stable staff
  — Sufficient equipment, with consistent design across users and locations
  — Work area design by human factors engineers
  — Elimination of distractions in work areas
  — Fatigue management
— Processes:
  — Patient and staff education
  — Safety protocols in place (patient identification, marking the site, time outs, etc.)
  — Protocols to advance EBM
  — Checklists to ensure key steps are not omitted (avoid checklist fatigue)
  — Teamwork training

SLIDE 26

Where to Report

— Hospital Notification System – errors, near misses, safety events
— Risk Management Department – errors, near misses, safety events
— Residency Program Director – errors, near misses, safety events
— QI/PS Registries – errors, near misses, safety events
— Service Line Medical Director – near misses, safety events
— Hospital Administrator on Call – safety events
— Safety Hot-Lines – safety events
— Security – safety events

SLIDE 27

Role Modeling and Teaching Good Practice

— Always promote a culture of safety
— Level the playing field
  — Set the authority gradient at an appropriate level to provide
    — patient safety
    — sufficient supervision of trainees
    — sufficient understanding of roles during team actions
    — confidence in all to speak up with safety concerns and to identify errors and near misses
— Report your own errors, near misses, and safety concerns, and encourage others to do the same
— Participate in root cause analyses and QI PDSA cycles