

SLIDE 1

Safety models & accident models

Eric Marsden

<eric.marsden@risk-engineering.org>

SLIDE 2

Mental models

▷ A safety model is a set of beliefs or hypotheses (often implicit) about the features and conditions that contribute to the safety of a system

▷ An accident model is a set of beliefs about the way in which accidents occur in a system

▷ Mental models are important because they impact system design, operational decisions and behaviours

2 / 18
SLIDE 3

Accidents as “acts of god”

▷ Fatalism: “you can’t escape your fate”

▷ Defensive attitude: accidents occur due to circumstances “beyond our control”

▷ Notion that appeared in Roman law: reasons that could exclude a person from absolute liability

  • e.g. violent storms & pirates exempted a captain from responsibility for his cargo

3 / 18
SLIDE 4

Simple sequential accident model

H. Heinrich’s domino model (1930)

Assumptions:

▷ Accidents arise from a quasi-mechanical sequence of events or circumstances that occur in a well-defined order

▷ An accident can be prevented by removing one of the “dominos” in the causal sequence

4 / 18
SLIDE 5

Simple sequential accident model

The “safety pyramid” or “accident triangle” (H. Heinrich, 1930 and F. Bird, 1970)

Assumptions:

▷ Each incident is an “embryo” of an accident (the mechanisms which cause minor incidents are the same as those that create major accidents)

▷ Reducing the frequency of minor incidents will reduce the probability of a major accident

▷ Accidents can be prevented by identifying and eliminating possible causes

5 / 18
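The pyramid assumption can be sketched numerically. The 1:29:300 ratio below (major injuries : minor injuries : near misses) is Heinrich’s classic illustrative figure, not a validated constant, and the `expected_events` helper is a hypothetical example of how the assumption would be used in practice:

```python
# Sketch of the "accident triangle" assumption: incident categories occur
# in fixed ratios, so counts at the base of the pyramid can be used to
# estimate the expected number of events at the top. The 1:29:300 ratio
# is Heinrich's historical figure, used here purely for illustration.

HEINRICH_RATIO = {"major_injury": 1, "minor_injury": 29, "near_miss": 300}

def expected_events(observed: int, observed_kind: str) -> dict:
    """Scale the pyramid ratios to the observed count of one category."""
    scale = observed / HEINRICH_RATIO[observed_kind]
    return {kind: scale * weight for kind, weight in HEINRICH_RATIO.items()}

# If a site logs 600 near misses, the model predicts ~2 major injuries.
estimate = expected_events(600, "near_miss")
print(estimate)  # {'major_injury': 2.0, 'minor_injury': 58.0, 'near_miss': 600.0}
```

Note that the later slides question exactly this assumption: major accidents in high-hazard systems do not necessarily share mechanisms with minor incidents.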
SLIDE 6

Simple sequential accident model

According to this model, safety is improved by identifying and eliminating “rotten apples”:

▷ front-line staff who generate “human errors”

▷ whose negligent attitude might propagate to other staff

Some accidents (in particular in high-risk systems) have more complicated origins…

6 / 18
SLIDE 7

On “human error”

“For a long time people were saying most accidents were due to human error and this is true in a sense but it’s not very helpful. It’s a bit like saying that falls are due to gravity…” — Trevor Kletz

A useful alternative concept to human error is performance variability.

7 / 18
SLIDE 8

Is it relevant to count errors?

▷ Counting errors produces a quantitative assessment of the “safety level” of a system

▷ Allows inter-comparison of systems

▷ Can constitute the point of departure for a search for the underlying causes of incidents

[Figure: an inverse relationship between the number of errors (a quantity) and the safety level (a quality)]

This simplistic model is widely criticized.

8 / 18
SLIDE 9

Is counting errors relevant?

Who is more dangerous?

▷ 700 000 doctors in the USA

▷ between 44 000 and 98 000 people die each year from a medical error
  → between 0.063 and 0.14 accidental deaths per doctor per year

▷ 80 million firearm owners in the USA

▷ responsible for ≈ 1 500 accidental deaths per year
  → 0.000019 accidental deaths per firearm owner per year

The probability that the human error of a doctor kills someone is 7500 times higher than for a firearm owner. [S. Dekker]

9 / 18
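The per-capita rates quoted on the slide can be reproduced directly from its figures:

```python
# Reproducing the slide's arithmetic (figures as quoted from S. Dekker).
doctors = 700_000
deaths_medical_low, deaths_medical_high = 44_000, 98_000
firearm_owners = 80_000_000
deaths_firearm = 1_500

rate_doctor_low = deaths_medical_low / doctors      # ≈ 0.063 deaths per doctor-year
rate_doctor_high = deaths_medical_high / doctors    # = 0.14 deaths per doctor-year
rate_owner = deaths_firearm / firearm_owners        # ≈ 0.000019 per owner-year

# Ratio of the two "error rates", using the high medical estimate:
print(round(rate_doctor_high / rate_owner))  # → 7467, i.e. roughly 7500
```

The point of the exercise is not the exact ratio but that raw error counts, without exposure and context, say little about where danger lies.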
SLIDE 13

Epidemiological accident model

[Figure: James Reason’s Swiss cheese model, from “Human Error” — an accident trajectory passes through aligned holes in successive layers of defence: technical barriers, safety management systems and procedures, sharp-end workers]

James Reason’s Swiss cheese model

Assumption: accidents are produced by a combination of active errors (poor safety behaviours) and latent conditions (environmental factors)

Consequences: prevent accidents by reinforcing barriers. Safety management requires monitoring via performance indicators.

10 / 18
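The Swiss cheese intuition can be sketched as a small simulation. The barrier failure probabilities below are invented for illustration, and the independence between layers is itself an assumption; the model’s notion of latent conditions is precisely a warning that layer failures can be correlated:

```python
import random

# Minimal sketch of the Swiss cheese intuition: a hazard reaches the loss
# only if the "holes" in every defensive layer line up. Failure
# probabilities are hypothetical, and layers are assumed independent.

barrier_failure_probs = [0.1, 0.05, 0.02]  # three illustrative layers

def accident_occurs(rng: random.Random) -> bool:
    """One trial: the hazard propagates only if every barrier fails."""
    return all(rng.random() < p for p in barrier_failure_probs)

rng = random.Random(42)
trials = 100_000
accidents = sum(accident_occurs(rng) for _ in range(trials))

# Under independence, the analytic accident probability is the product
# of the layer failure probabilities: 0.1 * 0.05 * 0.02 = 1e-4.
analytic = 1.0
for p in barrier_failure_probs:
    analytic *= p
print(f"analytic p = {analytic:.6f}, simulated ≈ {accidents / trials:.6f}")
```

Adding a layer multiplies the product by another small factor, which is the quantitative face of the defence-in-depth argument.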
SLIDE 14

Bow-tie model

[Figure: bow-tie diagram — causes on the left lead through preventive barriers to the top event, which leads through protective barriers to impacts on the right]

11 / 18
SLIDE 15

Bow-tie model

[Figure: the left half of the bow-tie is a fault tree (causes leading to the top event), the right half an event tree (top event leading to impacts). Example fault tree: “no flow to receiver” ← “no flow from component B” ← “no flow into component B” (no flow from component A1, because source 1 fails or A1 blocks flow, AND no flow from component A2, because source 2 fails or A2 blocks flow) OR “component B blocks flow”]

11 / 18
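The example fault tree on this slide (two redundant supply lines A1/A2 feeding component B) can be evaluated numerically. The basic-event probabilities below are hypothetical, and basic events are assumed independent:

```python
# Sketch of evaluating the slide's fault tree for "no flow to receiver".
# Basic-event probabilities are made-up illustrative numbers.

p_source = 0.01   # probability a source delivers no flow
p_block = 0.005   # probability a component blocks flow

def or_gate(*ps):
    """P(at least one input event), assuming independence."""
    none = 1.0
    for p in ps:
        none *= 1.0 - p
    return 1.0 - none

def and_gate(*ps):
    """P(all input events), assuming independence."""
    out = 1.0
    for p in ps:
        out *= p
    return out

p_no_flow_a1 = or_gate(p_source, p_block)               # source1 fails OR A1 blocks
p_no_flow_a2 = or_gate(p_source, p_block)               # source2 fails OR A2 blocks
p_no_flow_into_b = and_gate(p_no_flow_a1, p_no_flow_a2) # both redundant lines lost
p_top = or_gate(p_no_flow_into_b, p_block)              # ... OR B itself blocks

print(f"P(no flow to receiver) ≈ {p_top:.6f}")
```

The AND gate over the redundant lines is what makes their joint loss rare; the single component B remains the dominant contributor to the top event.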
SLIDE 16

Bow tie diagram

12 / 18
SLIDE 17

Bow-tie: example

13 / 18
SLIDE 18

Loss of control accident model

[Figure: timeline of a loss-of-control accident — prevention before the destabilization point, then recovery, accident, and mitigation]

Figure source: French BEA 14 / 18
SLIDE 19

Drift into failure

[Figure: Rasmussen’s migration diagram — the space of possible operating points is bounded by economic failure, unacceptable workload, and unsafe performance]

Human behaviour in any large system is shaped by constraints: profitable operations, safe operations, feasible workload. Actors experiment within the space formed by these constraints.

Management pressure for efficiency: management provides a “cost gradient” which pushes activity towards economic efficiency.

Gradient towards least effort: workers seek to maximize the efficiency of their work, with a gradient in the direction of reduced workload.

Drift towards failure: these pressures push work to migrate towards the limits of acceptable (safe) performance. Accidents occur when the system’s activity crosses the boundary into unacceptable safety. A process of “normalization of deviance” means that deviations from the safety procedures established during system design progressively become acceptable, then standard ways of working.

Effect of a “questioning attitude” and safety margin: mature high-hazard systems apply the defence-in-depth design principle and implement multiple independent safety barriers. They also put in place programmes aimed at reinforcing people’s questioning attitude and their chronic unease, making them more sensitive to safety issues. These shift the perceived boundary of safe performance. The difference between the minimally acceptable level of safe performance and the boundary at which safety barriers are triggered is the safety margin.

Figure adapted from Risk management in a dynamic society, J. Rasmussen, Safety Science, 1997:27(2)

15 / 18
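The drift dynamic can be caricatured as a one-dimensional random walk: a steady gradient (efficiency and least-effort pressures) plus everyday variability carries the operating point toward the boundary of safe performance. All numbers below are invented for illustration, and the real model is of course multidimensional:

```python
import random

# Toy illustration of Rasmussen's drift: the operating point migrates
# under a steady efficiency/effort gradient plus everyday variability,
# until it crosses the boundary of safe performance at `margin`.
# All parameter values are invented for illustration.

def steps_until_unsafe(drift=0.01, noise=0.05, margin=1.0, seed=0):
    """Random walk from 0; return the number of steps before crossing `margin`."""
    rng = random.Random(seed)
    x, steps = 0.0, 0
    while x < margin:
        x += drift + rng.gauss(0.0, noise)
        steps += 1
    return steps

# A wider safety margin (barriers, questioning attitude) delays the
# boundary crossing; it does not remove the underlying drift.
print(steps_until_unsafe(margin=1.0, seed=1), steps_until_unsafe(margin=2.0, seed=1))
```

The qualitative lesson matches the slide: without a counter-gradient, only the time to the boundary changes, not the direction of travel.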
SLIDE 24

Non-linear accident model

Systemic models

▷ FRAM (Hollnagel, 2000)

▷ STAMP (Leveson, 2004)

Assumption: accidents result from an unexpected combination and resonance of normal variations in performance

Consequences: preventing accidents means understanding and monitoring performance variations. Safety requires the ability to anticipate future events and react appropriately.

16 / 18
SLIDE 25

Image credits

▷ Sodom and Gomorrah burning (slide 26): Picu Pătruţ, public domain, via Wikimedia Commons

▷ Dominos (slide 27): H. Heinrich, Industrial Accident Prevention: A Scientific Approach, 1931

For more free content on risk engineering, visit risk-engineering.org

17 / 18
SLIDE 26

Feedback welcome!

Was some of the content unclear? Which parts were most useful to you? Your comments to feedback@risk-engineering.org (email) or @LearnRiskEng (Twitter) will help us to improve these materials. Thanks!

@LearnRiskEng fb.me/RiskEngineering

This presentation is distributed under the terms of the Creative Commons Attribution – Share Alike licence.

For more free content on risk engineering, visit risk-engineering.org

18 / 18