Just Culture
Recognizing and Reporting Errors, Near Misses and Safety Events
Robert McKay, M.D. Department of Anesthesiology KUSM-Wichita
Why A Cultural Change?
“The single greatest impediment to error prevention in the medical industry is that we punish people for making mistakes.”
– Dr. Lucian Leape, Professor, Harvard School of Public Health in Testimony before Congress
–Albert Einstein
– Don Norman, Author of “The Design of Everyday Things”
happening again?
managers?
–Albert Einstein
–Apologies to Francis Pharcellus Church (1839–1906)
safety
resulting from an error), gross misconduct (e.g., falsifying records, intoxication, etc.)
Heuristics – rules governing judgment or decision making.
As short cuts, cognitive biases are used more often in complex, time-pressured (production-pressured) systems such as healthcare.
Wishful thinking – believing something is true (or false) simply because you want it to be so (e.g., Bernie Sanders will win in 2016)
The 1897 version: If you see it in THE SUN, it’s so.
Availability heuristic – judging by the most recent information or the first possibility that comes to mind.
Representativeness – assuming similar presentations have similar causes, e.g., fever in the last 2 patients was from atelectasis, thus it must be from atelectasis this time.
“I can do this safely this time” (estimating the risk as lower than it is) or “I’m afraid of this (very rare) outcome” (estimating the risk as higher than it is)
Smiling (and better looking) people are:
More likely to be treated with leniency
Seen as more trustworthy, honest, sincere and admirable
Negative affect
Feeling negative increases the perceived risk of a negative outcome (e.g., terrorism in the U.S.)
It also increases the perceived frequency of negative outcomes
Lack of affect can also lower perceived risk
Climate change is thought unlikely by those unexposed to significant weather changes
Affect (feeling) trumps numbers (statistics)
Explains why terrorism is scarier than driving, even though you are far more likely to be killed just driving to work
Am I at fault?
Or did you have a poor night’s sleep?
Or is this mandatory and you have no interest in the subject?
Or did you have a great lunch and have postprandial fatigue?
cognitive biases as short cuts
consequences of a given action +
(such as a crash cesarean section) =
before they can cause harm
Increased use of (unbiased) technological aids:
Triggers and flags, forcing functions, decision support, checklists, protocols, CPOE, medication scanners
Use of rapid response teams (intervention before harm)
Culture of safety with root cause analyses and reporting of actual or potential safety breaches, i.e., of critical incidents
Quality improvement cycles (e.g., PDSA) to address error chains
Team training and crisis resource management (CRM)
Education, e.g., the Five Rights (right medication, right dose, right time, right route, right patient), EBM protocols, etc.
A just culture pairs blameless reporting with appropriate accountability:
Errors are expected to occur but are uniformly reported and are used as learning events to improve systems and individual behaviors
A culture of safety is foremost (example: FAA reporting system)
Accountability still applies to at-risk behavior that increases risks to patients (e.g., taking shortcuts) and to reckless behavior (e.g., refusing to perform safety steps, not reporting errors)
A purely blameless culture would allow willfully disruptive, negligent or harmful behavior to persist and lead to patient harm.
Anonymous reporting
Advantages:
Higher likelihood of errors being reported
“Safe” reporting with less fear of reprisal
Less concern about need for legal protection
Disadvantages:
Can be associated with an increased level of false reports
May include malicious and untrue reports
Error causes may be difficult to investigate, as you can’t seek additional information

Identifiable source
Disadvantages:
Errors less likely to be reported (a just culture with punishment for non-reporting can help)
Advantages:
Can verify the accuracy of the report
Can usually obtain more details about the error, including investigation into the error chain
Less likely to be a false or malicious report
Confidentiality protects those who report; in a fully just culture, such protection of privacy would be unnecessary
a timely fashion
reports and developing action plans
May miss many errors and latent safety problems
No feedback after report given
Forms are difficult to use; lack of time
Incident seemed “trivial” as no patient harm resulted
Overlooking of latent errors leads to a culture of low expectations, with normalization of deviance
Heavy clinical load = forgot to report
Not sure who should complete the report
Don’t want to get anyone in trouble (including self)
process
by each step – greater threats are addressed first
all workers can trigger alerts
Most complex systems display resiliency
A single error seldom leads to patient harm
Error chain – a series of events that leads to a disastrous outcome
Breaking the chain at any point may prevent the bad outcome
Root cause analysis can categorize the errors into common
links:
Failure to follow standard procedures
Poor leadership
Breakdowns in communication or teamwork
Overlooking or ignoring individual fallibility
Losing track of objectives
Team training, fishbone diagrams, etc. can help address the error chain
Right Structure + Right Processes = Right Outcomes
Structure:
Closed units when appropriate (closed units have better outcomes and lower costs)
Clinical information systems
Sufficient patient volume to develop expertise
Stable staff
Sufficient equipment – with consistent design across users and locations
Work area design by human factors engineers
Elimination of distractions in work areas
Fatigue management
Processes
Patient and staff education
Safety protocols in place (patient identification, marking site, time outs, etc.)
Protocols to advance EBM
Checklists ensure key steps are not omitted (avoid checklist fatigue)
Teamwork training
Hospital Notification System – errors, near misses, safety events
Risk Management – errors, near misses, safety events
Residency Program Director – errors, near misses, safety events
QI/PS Registries – errors, near misses, safety events
Service Line Medical Director – near misses, safety events
Hospital Administrator on Call – safety events
Safety Hot-Lines – safety events
Security – safety events
Set the authority gradient at an appropriate level to provide:
Patient safety
Sufficient supervision of trainees
Sufficient understanding of roles during team actions
Confidence in all to speak up with safety concerns and to identify errors and near misses
Report errors and near misses, and encourage others to do the same
Participate in root cause analyses and QI PDSA cycles