
SLIDE 1
SLIDE 2

EPN ATSEP Workshop 2019

Sten Winther October 2019

SLIDE 3

Welcome

Human Factors or Human Performance

SLIDE 4

Case Boeing Model 299

October 30, 1935, Wright Field, Ohio. Flight competition – next-generation long-range bomber.

SLIDE 5

Case Boeing Model 299

Investigation: Determined that the cause of the crash was a very simple thing — the control lock (gust lock) had been left in place.

https://www.thisdayinaviation.com/tag/boeing-model-299/

SLIDE 6

Opinion: Modern planes were simply too complex to operate safely, even by two of the best test pilots in the world.

Boeing Model 299

SLIDE 7

Human factors

Human factors and ergonomics (commonly referred to as human factors) is the application of psychological and physiological principles to the (engineering and) design of products, processes, and systems. The goal of human factors is to reduce human error, increase productivity, and enhance safety and comfort with a specific focus on the interaction between the human and the thing of interest. [Wikipedia]

Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall performance. [International Ergonomics Association, 2016]

SLIDE 8

Cognitive Bias

SLIDE 9

Human in the system

SLIDE 10

System to perform a function

f(x) = model of the system
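Read concretely (a minimal illustrative sketch, not part of the presentation; the transfer factor and the operator's gain are made-up numbers), the slide's point is that the system performs a function f(x), and the human in the system closes the loop around it:

```python
# Illustrative sketch only: the system is modeled as a function f(x);
# the human in the system monitors the output and adjusts the input.
# The factor 0.9 and the gain 0.5 are hypothetical numbers.

def f(x: float) -> float:
    """Model of the system: maps an input (e.g. a setpoint) to an output."""
    return 0.9 * x  # hypothetical system: it slightly undershoots its input


def human_operator(target: float, output: float, x: float) -> float:
    """The human in the loop: compares output with intent, corrects the input."""
    error = target - output
    return x + 0.5 * error  # simple proportional correction


target, x = 100.0, 100.0
for step in range(5):
    y = f(x)                          # the system performs its function
    x = human_operator(target, y, x)  # the operator compensates
    print(f"step {step}: input {x:6.1f}, output {y:6.1f}")
```

After a few corrections the output settles on the target: the function alone does not deliver the intended result, the human-plus-machine loop does.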

SLIDE 11

Safety I (Erik Hollnagel) vs. Safety II

  • Linear vs. complex interaction
  • Risks emerge in a system that is considered fundamentally safe vs. risk and safety coming from the same sources
  • Technical failure or human mistakes vs. variation
  • The machine metaphor vs. the ecosystem metaphor

SLIDE 12

SLIDE 13

Operational performance

[Figure (Erik Hollnagel): performance over time, against a limit of unacceptable performance.]

SLIDE 14

System design and human error

[Figure (Todd Conklin): of operational upsets, about 10% are equipment failures and 90% are human errors; of the human errors, roughly 70% are system-induced and 30% are individual errors.]
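Assuming the 70/30 split is taken within the 90% human-error share (my reading of the figure, not stated explicitly on the slide), the combined shares of all upsets work out as:

```latex
% Shares of all operational upsets, assuming the 70/30 split
% applies within the 90% human-error share:
\[
\underbrace{0.90 \times 0.70}_{\text{system-induced error}} = 0.63,
\qquad
\underbrace{0.90 \times 0.30}_{\text{individual error}} = 0.27 .
\]
```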

SLIDE 15

Principles of Human Performance (Todd Conklin)

  • 1. People are fallible
  • 2. Error-likely situations are predictable
  • 3. Individual behaviors are influenced
  • 4. Operational upsets can be avoided
  • 5. Management’s response to failures matters
SLIDE 16

Safety Management

Safety management spans three domains: a safe domain, a dilemma domain, and a failure domain.

  • Safe domain: routine practice, known conditions, high situation awareness, the comfort zone.
  • Dilemma domain (the domain for dilemmas): change vs. stability, technical vs. operation, VFR vs. business, regulation vs. freedom, staffing vs. flexibility, culture vs. procedures, degradation vs. mitigation, military vs. civil.
  • Failure domain: failures, faults, mishaps, slips, lapses, mistakes, error.

SLIDE 17

Work as done and work as imagined

SLIDE 18

Human Error

SLIDE 19

Human Error 2

SLIDE 20

Human Error 3

SLIDE 21

Safety Differently, Safety II, Human Error: four approaches post-TMI (Three Mile Island)

  • James Reason: Slips, Lapses and Errors; the Swiss Cheese Model
  • David Woods: Human Error is contested – but does it matter anyway?
  • Erik Hollnagel: There is no such thing as Human Error – it does not exist!
  • Sidney Dekker: Drift into failure.

All four were significantly influenced by Jens Rasmussen

The Causality Credo is challenged

SLIDE 22

Reality

“…all the foundations of a system that is maintaining a fragile balance on a global scale: producing more, using more complex tools, in more difficult places, inevitably resulting in greater risks. The art of successful intervention in safety involves controlling the compromise and the trade-offs between the benefits of controlled safety and the resulting losses in terms of managed safety.”

(Amalberti)

SLIDE 23

Total safety in ultra-safe systems like ATM

Total safety = controlled safety + managed safety

  • The level of safety is high, but the adaptive expertise of operators is consequently reduced, as they are no longer exposed to exceptional situations and are no longer trained to work outside their procedural framework.
  • There is currently no known solution that can preserve both the expertise of operators in exceptional situations and the benefit of achieving maximum system safety by procedural means.

(Amalberti, 2013)

SLIDE 24

Production vs. safety vs. economic efficiency

Jens Rasmussen 1997

SLIDE 25

Major socio-technical system

SLIDE 26

Complex systems

  • 1. Complex systems are intrinsically hazardous systems.
  • 2. Complex systems are heavily and successfully defended against failure.
  • 3. Catastrophe requires multiple failures – single-point failures are not enough.
  • 4. Complex systems contain changing mixtures of failures latent within them.
  • 5. Complex systems run in degraded mode.
  • 6. Catastrophe is always just around the corner.
  • 7. Post-accident attribution to a ‘root cause’ is fundamentally wrong.
  • 8. Hindsight biases post-accident assessments of human performance.
  • 9. Human operators have dual roles: as producers and as defenders against failure.
  • 10. All practitioner actions are gambles.
  • 11. Actions at the sharp end resolve all ambiguity.
  • 12. Human practitioners are the adaptable element of complex systems.
  • 13. Human expertise in complex systems is constantly changing.
  • 14. Change introduces new forms of failure.
  • 15. Views of ‘cause’ limit the effectiveness of defenses against future events.

(Richard I. Cook, “How Complex Systems Fail”)
SLIDE 27

The human in the system

  • Historical perspective
  • Future perspective
SLIDE 28

Human–machine interface (HMI)

The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads and touchscreens are examples of the physical part of the HMI which we can see and touch.

SLIDE 29

Example of HMI

SLIDE 30

Criteria for good HMI

  • Visibility: by looking, the user can tell the state of the system and the alternatives for action.
  • Constraints: the system enforces critical interactions.
  • Good mapping: it is possible to determine the relationship between actions and their results, between the controls and their effects, and between the system state and what is visible.
  • Feedback: the user receives full and continuous feedback about the results of their actions.
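To make these four criteria concrete, here is a minimal toy sketch (my own illustration, not from the presentation; the PumpHMI class and all names are hypothetical) of a two-state pump control that exhibits visibility, constraints, mapping and feedback:

```python
# Toy HMI sketch (hypothetical): a pump with two states, START and STOP
# controls, illustrating the four criteria for good HMI.

class PumpHMI:
    """Toy human-machine interface for a pump with two states."""

    def __init__(self) -> None:
        self.running = False

    # Visibility: the user can always read the system state and the
    # action available from that state.
    def display(self) -> str:
        state = "RUNNING" if self.running else "STOPPED"
        action = "stop" if self.running else "start"
        return f"pump {state}; available action: {action}"

    # Constraints + good mapping: each control maps to exactly one effect,
    # and invalid interactions are rejected rather than silently ignored.
    def press(self, button: str) -> str:
        if button == "start" and not self.running:
            self.running = True
        elif button == "stop" and self.running:
            self.running = False
        else:
            # Feedback: the user is told why nothing happened.
            return f"rejected: '{button}' is not valid now ({self.display()})"
        # Feedback: every accepted action reports the resulting state.
        return f"ok: '{button}' accepted ({self.display()})"


hmi = PumpHMI()
print(hmi.display())       # visibility: state and valid action shown
print(hmi.press("stop"))   # constraint: rejected, pump is not running
print(hmi.press("start"))  # mapping + feedback: pump is now RUNNING
```

An invalid press is answered with an explanation rather than silence, which is the feedback criterion at work alongside the constraint.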

SLIDE 31

Good examples of HMI

Good design is not just a question of good HMI.

SLIDE 32

The Nitty-Gritty of Human Factors: a practical approach

  • First Principle: Trade-offs and workarounds
  • Second Principle: The Minimal Action Rule
  • Third Principle: Form should match function and vice versa
  • Fourth Principle: What you look for is what you see
  • Fifth Principle: Show what is going on

(Steven Shorrock; Erik Hollnagel, “The Nitty-Gritty of Human Factors”)
SLIDE 33

Automation

SLIDE 34

Level of automation
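The level-of-automation figure itself is not reproduced in this transcript. As a stand-in, one widely cited reference scale is Sheridan and Verplank's ten levels of automation; assuming that is roughly what the slide's figure showed, a sketch:

```python
# Sheridan & Verplank's ten levels of automation, from fully manual to
# fully autonomous (an assumption about what the slide's figure showed).
LEVELS_OF_AUTOMATION = (
    "1. The computer offers no assistance; the human does everything.",
    "2. The computer offers a complete set of action alternatives.",
    "3. The computer narrows the selection down to a few alternatives.",
    "4. The computer suggests one alternative.",
    "5. The computer executes that suggestion if the human approves.",
    "6. The computer allows the human limited time to veto before acting.",
    "7. The computer acts automatically, then necessarily informs the human.",
    "8. The computer informs the human after acting only if asked.",
    "9. The computer informs the human after acting only if it decides to.",
    "10. The computer decides everything and acts autonomously.",
)

for level in LEVELS_OF_AUTOMATION:
    print(level)
```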

SLIDE 35

Ironies of automation

  • Tasks that are too difficult to automate are left to humans.
  • Most of the time, humans are there to monitor the system.
  • Operators get de-skilled because of lack of practice.
  • Whenever something goes wrong, humans are given back control.
  • Humans are removed from the system because they are not reliable, but they are asked to intervene in the most difficult situations.

Bainbridge, 1987

SLIDE 36

Automation paradox

“Manual control is a highly skilled activity, and skills need to be practiced continuously in order to maintain them. Yet an automatic control system that fails only rarely denies operators the opportunity for practicing these basic control skills … when manual takeover is necessary something has usually gone wrong; this means that operators need to be more rather than less skilled in order to cope with these atypical conditions.”

(Bainbridge, 1987)

SLIDE 37

Team Work

  • Demand for quick response
  • Conditions: complexity and distribution of technology

“Purpose affirms trust, trust affirms purpose, and together they forge individuals into a working team.”

― Stanley McChrystal, Team of Teams: New Rules of Engagement for a Complex World

SLIDE 38

Video: ThyssenKrupp

  • https://www.youtube.com/watch?v=biNebig1gUI

SLIDE 39

Systems thinking

“Human beings, viewed as behaving systems, are quite simple. The apparent complexity of our behavior over time is largely a reflection of the complexity of the environment in which we find ourselves.”

(Simon, 1996)

SLIDE 40

User-centered design principles

  • Early focus on users, activities and context.
  • Active involvement of users.
  • Appropriate allocation of function between user and system.
  • Incorporation of user-derived feedback into system design.
  • Iterative design.

SLIDE 41

Standards

  • MIL-STD-1472G
  • FAA System Safety Handbook, Chapter 17: Human Factors Principles & Practices (December 30, 2000)
  • ICAO Human Factors Digest No. 11 – Human Factors in CNS/ATM Systems (Circular 249)
  • ISO 9241-210:2010
  • ISO 6385:2004
  • CANSO Human Performance Standard of Excellence
  • CAA CAP 1377 (Automation)

SLIDE 42

Accidents and incidents

Although the additional flight crew were a valuable resource, had they not been available, the primary flight crew would likely have responded to the situation in a similar manner. However, the gathering of information to assist in decision making would have required the use of alternative resources and methods.

SLIDE 43

Accidents and incidents

SLIDE 44

Accidents and incidents

When multiple failures or out-of-limits conditions were detected by ECAM, they were prioritised according to the programmed ECAM logic. In that event, the highest-priority procedure relating to the sensed condition was displayed first.

SLIDE 45

? !