EPN ATSEP Workshop 2019
Sten Winther, October 2019


  1. EPN ATSEP Workshop 2019 Sten Winther October 2019

  2. Welcome: Human Factors or Human Performance?

  3. Case Boeing Model 299 October 30, 1935, Wright Air Field, Ohio Flight competition – next generation long-range bomber.

  4. Case Boeing Model 299 Investigation: Determined that the cause of the crash was a very simple thing — the control lock (gust lock) had been left in place. https://www.thisdayinaviation.com/tag/boeing-model-299/

  5. Boeing Model 299, Opinion: Modern planes were simply too complex to operate safely, even by two of the best test pilots in the world.

  6. Human factors – Human factors and ergonomics (commonly referred to as human factors) is the application of psychological and physiological principles to the engineering and design of products, processes, and systems. The goal of human factors is to reduce human error, increase productivity, and enhance safety and comfort, with a specific focus on the interaction between the human and the thing of interest. [Wikipedia] Ergonomics (or human factors) is the scientific discipline concerned with the understanding of interactions among humans and other elements of a system, and the profession that applies theory, principles, data and methods to design in order to optimize human well-being and overall system performance. [International Ergonomics Association, 2016]

  7. Cognitive Bias

  8. Human in the system

  9. System to perform a function f(x): f(x) = model of the system.

  10. Safety-I vs. Safety-II (Erik Hollnagel). Safety-I: the machine metaphor; linear interactions; risks emerge from technical failure or human mistakes in a system that is considered fundamentally safe. Safety-II: the ecosystem metaphor; complex interactions; risk and safety come from the same sources: variation.

  11. Operational performance: performance over time, with a limit of unacceptable performance. Erik Hollnagel

  12. System design and human error (Todd Conklin). Human errors: 30% individual error, 70% system-induced error. Operational upsets: 90% human error, 10% equipment failure.

  13. Principles of Human Performance (Todd Conklin) 1. People are fallible 2. Error-likely situations are predictable 3. Individual behaviors are influenced 4. Operational upset can be avoided 5. Management’s response to failures matters

  14. Safe – Safety Management: domains for dilemmas. Safe practice domain: routine, known conditions, high situation awareness, comfort zone. Dilemma domain (operation): change vs. stability, technical vs. operation, VFR vs. business, regulation vs. freedom, staffing vs. flexibility, culture vs. procedures, degradation vs. mitigation, military vs. civil. Failure domain: failures, faults, mishaps, slips, lapses, mistakes, error.

  15. Work as done and work as imagined

  16. Human Error

  17. Human error 2

  18. Human error 3

  19. Safety Differently, Safety-II, Human Error. Four approaches post-TMI: • James Reason: slips, lapses and mistakes; the Swiss Cheese Model • David Woods: human error is contested – but does it matter anyway? • Erik Hollnagel: there is no such thing as human error – it does not exist! • Sidney Dekker: drift into failure. All four were significantly influenced by Jens Rasmussen. The causality credo is challenged.

  20. Reality – ...all the foundations of a system that is maintaining a fragile balance on a global scale: producing more, using more complex tools, in more difficult places, inevitably resulting in greater risks. The art of successful intervention in safety involves controlling the compromise and the trade-offs between the benefits of controlled safety and the resulting losses in terms of managed safety. (Amalberti)

  21. Total safety in ultra-safe systems like ATM. Total safety = controlled safety + managed safety. • The level of safety is high, but the adaptive expertise of operators is consequently reduced, as they are no longer exposed to exceptional situations and are no longer trained to work outside their procedural framework. • There is currently no known solution that can preserve both the expertise of the operators in exceptional situations and the benefit of achieving maximum system safety by procedural means. (Amalberti, 2013)

  22. Production vs. safety vs. economic efficiency Jens Rasmussen 1997

  23. Major socio-technical system

  24. Complex systems (Richard Cook, "How Complex Systems Fail"): 1. Complex systems are intrinsically hazardous systems. 2. Complex systems are heavily and successfully defended against failure. 3. Catastrophe requires multiple failures – single-point failures are not enough. 4. Complex systems contain changing mixtures of failures latent within them. 5. Complex systems run in degraded mode. 6. Catastrophe is always just around the corner. 7. Post-accident attribution to a 'root cause' is fundamentally wrong. 8. Hindsight biases post-accident assessments of human performance. 9. Human operators have dual roles: as producers and as defenders against failure. 10. All practitioner actions are gambles. 11. Actions at the sharp end resolve all ambiguity. 12. Human practitioners are the adaptable element of complex systems. 13. Human expertise in complex systems is constantly changing. 14. Change introduces new forms of failure. 15. Views of 'cause' limit the effectiveness of defenses against future events.

  25. The human in the system • Historical perspective • Future perspective

  26. Human–machine interface (HMI). The user interface or human–machine interface is the part of the machine that handles the human–machine interaction. Membrane switches, rubber keypads and touchscreens are examples of the physical part of the human–machine interface which we can see and touch.

  27. Example of HMI

  28. Criteria for good HMI • Visibility: by looking, the user can tell the state of the system and the alternatives for action. • Constraints: the system enforces critical interactions. • Good mapping: it is possible to determine the relationships between actions and their results, between the controls and their effects, and between the system state and what is visible. • Feedback: the user receives full and continuous feedback about the results of their actions.
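  These criteria can be made concrete with a toy example. Below is a minimal sketch (not from the slides; the pump-controller scenario and all names are hypothetical) of how an interface can exhibit visibility, constraints, mapping and feedback:

  ```python
  # Toy HMI sketch: a two-state pump controller illustrating the four
  # criteria above. All names are hypothetical, for illustration only.

  class PumpHMI:
      def __init__(self):
          self.state = "STOPPED"

      def available_actions(self):
          # Visibility: the user can always see which actions are valid
          # in the current system state.
          return {"STOPPED": ["start"], "RUNNING": ["stop"]}[self.state]

      def do(self, action):
          # Constraints: invalid interactions are rejected explicitly,
          # not silently ignored.
          if action not in self.available_actions():
              return f"REJECTED: '{action}' not allowed in state {self.state}"
          # Good mapping: each action has one predictable effect on state.
          self.state = "RUNNING" if action == "start" else "STOPPED"
          # Feedback: every action reports the resulting system state.
          return f"OK: pump is now {self.state}"

  hmi = PumpHMI()
  print(hmi.available_actions())  # ['start']
  print(hmi.do("start"))          # OK: pump is now RUNNING
  print(hmi.do("start"))          # REJECTED: 'start' not allowed in state RUNNING
  ```

  The design choice worth noting: rejecting an invalid action with an explicit message keeps the user's mental model aligned with the system state, whereas silently ignoring it would break both the constraint and the feedback criteria.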

  29. Good examples of HMI. Good design is not just a question of good HMI.

  30. The Nitty-Gritty of Human Factors – a practical approach • First principle: trade-offs and workarounds • Second principle: the minimal action rule • Third principle: form should match function and vice versa • Fourth principle: what you look for is what you see • Fifth principle: show what is going on (Steven Shorrock; Erik Hollnagel, "The Nitty-Gritty of Human Factors")

  31. Automation

  32. Level of automation

  33. Ironies of automation • Tasks that are too difficult to automate are left to humans • For most of the time, humans are there to monitor the system • Operators become de-skilled because of lack of practice • Whenever something goes wrong, humans are given back control • Humans are removed from the system because they are not reliable, but they are asked to intervene in the most difficult situations. (Bainbridge, 1987)

  34. Automation paradox “Manual control is a highly skilled activity, and skills need to be practiced continuously in order to maintain them. Yet an automatic control system that fails only rarely denies operators the opportunity for practicing these basic control skills … when manual takeover is necessary something has usually gone wrong; this means that operators need to be more rather than less skilled in order to cope with these atypical conditions.”

  35. Team work • Demand for quick response • Conditions: complexity and distribution of technology. "Purpose affirms trust, trust affirms purpose, and together they forge individuals into a working team." ― Stanley McChrystal, Team of Teams: New Rules of Engagement for a Complex World

  36. Video ThyssenKrupp • https://www.youtube.com/watch?v=biNebig1gUI

  37. Systems thinking • Human beings, viewed as behaving systems, are quite simple. The apparent complexity of our behavior over time is largely a reflection of the complexity of the environment in which we find ourselves. (Simon 1996)

  38. User-centered design principles • Early focus on users, activities and context. • Active involvement of users. • Appropriate allocation of function between user and system. • Incorporation of user-derived feedback into system design. • Iterative design.

  39. Standards • MIL-STD-1472G • FAA System Safety Handbook, Chapter 17: Human Factors Principles & Practices, December 30, 2000 • ICAO Human Factors Digest No. 11 – Human Factors in CNS/ATM Systems (Circular 249) • ISO 9241-210:2010 • ISO 6385:2004 • CANSO Human Performance Standard of Excellence • CAA CAP 1377 (Automation)

  40. Accidents and incidents – Although the additional flight crew were a valuable resource, had they not been available, the primary flight crew would likely have responded to the situation in a similar manner. However, the gathering of information to assist in decision making would have required the use of alternative resources and methods.

  41. Accidents and incidents
