EPN ATSEP Workshop 2019
Sten Winther October 2019
Human Factors or Human Performance
Welcome
Case Boeing Model 299
October 30, 1935, Wright Field, Ohio: flight competition for the next-generation long-range bomber.
Investigation: Determined that the cause of the crash was a very simple thing — the control lock (gust lock) had been left in place.
https://www.thisdayinaviation.com/tag/boeing-model-299/
Opinion: Modern planes were simply too complex to operate safely, even by two of the best test pilots in the world.
Boeing Model 299
Human factors
Human factors and ergonomics (commonly referred to as human factors) is the application of psychological and physiological principles to the (engineering and) design of products, processes, and systems. The goal of human factors is to reduce human error, increase productivity, and enhance safety and comfort, with a specific focus on the interaction between the human and the thing of interest. [Wikipedia]

Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall performance.
[International Ergonomics Association, 2016]
Cognitive Bias
Human in the system
A system performs a function: f(x) = model of the system
Safety-I vs. Safety-II (Erik Hollnagel)

Safety-I: Linear. Risks emerge in a system that is considered fundamentally safe. Technical failure or human mistakes. The machine metaphor.

Safety-II: Complex interaction. Risk and safety come from the same sources. Variation. The ecosystem metaphor.
Operational performance
[Figure (Erik Hollnagel): performance over time, approaching the limit of unacceptable performance.]
System design and human error
Of operational upsets, roughly 10% are attributed to equipment failure and 90% to human error; of the human errors, about 70% are system-induced and 30% are individual errors.
Principles of Human Performance (Todd Conklin)
- 1. People are fallible
- 2. Error-likely situations are predictable
- 3. Individual behaviors are influenced
- 4. Operational upset can be avoided
- 5. Management’s response to failures matters
Safe – Safety Management
Domain for dilemmas
Three domains: the safe domain, the dilemma domain and the failure domain.
[Diagram terms: routine practice, known conditions, high situation awareness, comfort zone; change, stability, technical operation, VFR, business, regulation, freedom, staffing, flexibility, culture, procedures, degradation, mitigation, military, civil; failures, faults, mishaps, slips, lapses, mistakes, error.]
Safety Management
Work-as-done and work-as-imagined
Human Error
Safety Differently, Safety-II, Human Error: four approaches post-TMI (Three Mile Island)
- James Reason: Slips, Lapses and Errors; the Swiss Cheese Model
- David Woods: Human Error is contested – but does it matter anyway?
- Erik Hollnagel: There is no such thing as Human Error – it does not exist!
- Sidney Dekker: Drift into failure.

All four were significantly influenced by Jens Rasmussen.
The Causality Credo is challenged
…all the foundations of a system that is maintaining a fragile balance on a global scale: producing more, using more complex tools, in more difficult places, inevitably resulting in greater risks. The art of successful intervention in safety involves controlling the compromise and the trade-offs between the benefits of controlled safety and the resulting losses in terms of managed safety.
Amalberti
Reality
Total safety = controlled safety + managed safety
- The level of safety is high, but the adaptive expertise of operators is consequently reduced, as they are no longer exposed to exceptional situations and are no longer trained to work outside their procedural framework.
- There is currently no known solution which can preserve both the expertise of the operators in exceptional situations and the benefit of achieving maximum system safety by procedural means.

(Amalberti, 2013)
Total safety in ultra safe systems like ATM
Production vs. safety vs. economic efficiency
Jens Rasmussen 1997
Major socio-technical system
Complex systems
1. Complex systems are intrinsically hazardous systems.
2. Complex systems are heavily and successfully defended against failure.
3. Catastrophe requires multiple failures – single-point failures are not enough.
4. Complex systems contain changing mixtures of failures latent within them.
5. Complex systems run in degraded mode.
6. Catastrophe is always just around the corner.
7. Post-accident attribution of accidents to a ‘root cause’ is fundamentally wrong.
8. Hindsight biases post-accident assessments of human performance.
9. Human operators have dual roles: as producers and as defenders against failure.
10. All practitioner actions are gambles.
11. Actions at the sharp end resolve all ambiguity.
12. Human practitioners are the adaptable element of complex systems.
13. Human expertise in complex systems is constantly changing.
14. Change introduces new forms of failure.
15. Views of ‘cause’ limit the effectiveness of defenses against future events.
- Historical perspective
The human in the system
- Future perspective
The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads and touchscreens are examples of the physical part of the human–machine interface which we can see and touch.
Human machine interface HMI
Example of HMI
- Visibility: By looking, the user can tell the state of the system and the alternatives for action.
- Constraints: The system enforces critical interactions.
- Good mapping: It is possible to determine the relationships between actions and their results, between the controls and their effects, and between the system state and what is visible.
- Feedback: The user receives full and continuous feedback about the results of their actions.
Criteria for good HMI
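The four criteria above can be sketched in software terms. The example below is a hypothetical interlock, loosely inspired by the Model 299 gust-lock case described earlier; the class and method names are illustrative assumptions, not a real avionics API.

```python
class FlightControls:
    """Hypothetical sketch of the four HMI criteria as software checks."""

    def __init__(self):
        self.gust_lock_engaged = True  # state the user must be able to see
        self.messages = []             # feedback channel

    def status(self):
        # Visibility: the user can always inspect the system state.
        return {"gust_lock_engaged": self.gust_lock_engaged}

    def release_gust_lock(self):
        # Good mapping: one control, one clearly named effect.
        self.gust_lock_engaged = False
        self._feedback("Gust lock released.")

    def request_takeoff(self):
        # Constraint: the system enforces the critical interaction order.
        if self.gust_lock_engaged:
            self._feedback("Takeoff refused: gust lock is engaged.")
            return False
        self._feedback("Takeoff permitted.")
        return True

    def _feedback(self, msg):
        # Feedback: every action produces an explicit, visible result.
        self.messages.append(msg)


controls = FlightControls()
assert controls.request_takeoff() is False   # blocked by the interlock
controls.release_gust_lock()
assert controls.request_takeoff() is True
```

The point of the sketch is that the constraint lives in the system, not in the operator's memory: the unsafe action is refused, and the refusal is fed back explicitly.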
Good examples of HMI
Good design is not just a question of good HMI.
- First Principle: Trade-offs and workarounds
- Second Principle: The Minimal Action Rule
- Third Principle: Form should match function and vice versa
- Fourth Principle: What you look for is what you see
- Fifth Principle: Show what is going on
(Steven Shorrock & Erik Hollnagel, “The Nitty-Gritty of Human Factors”)
The Nitty-Gritty of Human Factors
- a practical approach
Automation
Level of automation
- Tasks that are too difficult to automate are left to humans
- Most of the time, humans are there to monitor the system
- Operators get de-skilled because of lack of practice
- Whenever something goes wrong, humans are given back control
- Humans are removed from the system because they are not reliable, but they are asked to intervene in the most difficult situations
Bainbridge, 1987
Ironies of automation
“Manual control is a highly skilled activity, and skills need to be practiced continuously in order to maintain them. Yet an automatic control system that fails only rarely denies operators the opportunity for practicing these basic control skills … when manual takeover is necessary something has usually gone wrong; this means that operators need to be more rather than less skilled in order to cope with these atypical conditions.”
Automation paradox
- Demand for quick response
- Conditions: complexity and distribution of technology
Team Work
“Purpose affirms trust, trust affirms purpose, and together they forge individuals into a working team.”
― Stanley McChrystal, Team of Teams: New Rules of Engagement for a Complex World
- https://www.youtube.com/watch?v=biNebig1gUI
Video: ThyssenKrupp
Human beings, viewed as behaving systems, are quite simple. The apparent complexity of our behavior over time is largely a reflection of the complexity of the environment in which we find ourselves.

(Simon, 1996)
Systems thinking
- Early focus on users, activities and context.
- Active involvement of users.
- Appropriate allocation of function between user and system.
- Incorporation of user-derived feedback into system design.
- Iterative design.
User-centered design principles
- MIL-STD-1472G
- FAA System Safety Handbook, Chapter 17: Human Factors Principles & Practices (December 30, 2000)
- ICAO Human Factor Digest No. 11 — Human Factors in CNS/ATM Systems (Circular 249)
- ISO 9241-210:2010
- ISO 6385:2004
- CANSO Human Performance Standard of Excellence
- CAA CAP 1377 (Automation)