
Resilience and Ergonomics in Aviation


Abstract

According to a widely accepted notion, resilience engineering is the ability of a complex system to be robust yet flexible. Aviation is a complex system made of technology, environment and people, and each of these components is complex in itself. This paper argues that the technology-centered paradigm shift that informed aircraft design during the early 1990s has lessened the capacity of the aviation system to be resilient. Since the human factor in aviation was seen as the weak link in the chain, more and more technology was introduced on board to take over a series of tasks traditionally accomplished by the pilots. The resulting opacity of the system, no longer thoroughly understood by its final users, has diminished the ability of pilots to deviate from established procedures and to create a safety buffer through innovative and creative management of their airplanes. Herein, some aspects of the human-technology relationship in aviation are examined, to evaluate whether certain innovations should be considered a step forward toward safety or a step back toward a less resilient system.


Full text

1 - Introduction

Air safety has improved dramatically over the years, but it is cyclically affected by different types of accidents that drive the accident curve back up. We will show how safety paradigms have shifted over time, and how each initially represented a solution and eventually became a problem. The corrections made to the aviation system over the last fifty years have followed the dynamics of accidents, as clarified by accident investigations. The approach to safety is determined by the glasses we wear when we analyze the causes of accidents. This paper focuses on the innovations introduced on airplanes during the early 1990s, when technology was seen as the answer to human fallibility. Moreover, it tries to demonstrate how new solutions often bring new problems. In this case, the relationship between pilots and automation will be briefly analyzed to show some of its setbacks. The result is that more and more technology on board has induced less flexibility and, eventually, a less resilient system.

2 - A brief history of accidents

Most corrections to existing systems or procedures in aviation were introduced following severe mishaps, so the path of the entire industry has been a kind of "trial and response" dynamic: innovation, mishap, correction. According to the statistics, human error has played a pivotal role in accidents, at a higher rate than other factors such as the environment (meteorological conditions, air traffic leading to mid-air collisions, and so on), mechanics (e.g., structural limits exceeded, poor cockpit design) and security (hijacking, bombs on board, etc.).

Starting in the 1940s, investigators wondered why airplanes crash. Taking it for granted that pilots were the fallible factor in the entire system, some began to analyze why pilots made so many errors. At the beginning, until the mid-1950s, the main cause of accidents was identified as "loss of control". This category includes situations in which pilots lost control of the airplane: reaching (and exceeding) its structural limits, stalling, overbanking, or entering an unusual attitude that jeopardized the progress of the flight. The root causes of loss of control ranged from fatigue to distraction, excessive workload, sleepiness, and so on. Briefly, the problem was located in the broad area of "human performance and limitations". The solution devised for this kind of problem was engineering: providing more systems, more aids and more technology.

(Figure 1: Accident rate over the decades; decades on the x-axis, accidents per million take-offs on the y-axis. Source: Flight Safety Foundation)

The technological approach focused on two fronts: innovation in ground-based aids to guide pilots with more precise instruments (e.g., radar and the ILS, Instrument Landing System) and implementation of new devices on board to lower the workload (autopilot, flight director, auto-throttle). These innovations were successful, since the accident rate dropped sharply. Nevertheless, during the 1970s the accident rate started to rise again, but with a different dynamic. The main cause of accidents shifted from loss of control to CFIT (Controlled Flight Into Terrain): a perfectly airworthy airplane, fully under the crew's control, strikes an obstacle, usually in the vicinity of the airport. Indeed, most accidents happen during the approach phase. Investigations revealed poor decision making, a loss of situational awareness, or an open or concealed conflict in progress between the pilots. In short, there was a problem in the human interaction on board.

This time the solution did not come through technology but through a new approach, based on psychological assumptions about what makes good teamwork. Other new technologies were introduced into the aviation system in that period, but it is generally assumed that the psychological approach was pivotal in improving the system's safety. Crew Resource Management (CRM) courses were implemented in most major airlines to enhance the interaction between the pilots (and, later, the entire crew, cabin attendants included). The accident curve dropped again, but during the 1990s it rose once more, though at a smaller magnitude than in past decades. The problem is that the overall volume of air transport is nowadays so huge that even a small number of accidents (lower than in any other transportation domain such as road, rail or sea) can be unbearable, for two reasons. Firstly, the human, legal and economic cost of an accident is enormous and can destroy an airline's stability, driving it out of the market. Secondly, an air accident has worldwide resonance and can distort the public's perception of air safety.

Whatever the consequences of air mishaps, it is essential to understand why they keep happening. During the 1990s the main cause of accidents shifted once again, swinging back like a pendulum to loss of control, but in a different shape from the one experienced during the 1950s. Today pilots have so many technological aids that it is hard to conceive how they can lose control of the airplane. Yet the implementation of so many systems is the consequence of an engineering approach to safety in which pilots are seen as the weak link in the chain, so that automation is intended to take over many functions traditionally performed by pilots. There is a widespread opinion among authors studying human factors in aviation that in this case we may speak of "over-redundancy": too many instruments induce a workload so low that it can breed complacency, while inadequate training leaves pilots unable to override the automation when it fails or misbehaves.

3 - A Case Study

Here a case study is briefly presented to illustrate the relationship between pilots and technology, in particular the complacency induced by automation. It concerns a recent accident in which automation surprise and unexpected situations played a leading role. The flight was a test flight carried out on an Airbus A-320 in France by a very skilled, expert and committed crew of test pilots. After performing a series of tests at high altitude, the crew decided to return to land, planning to perform the stall test (to verify the airplane's behavior at low speed) just minutes before reaching the base airport. Unfortunately, something happened while the crew was flying at high altitude: the angle-of-attack sensor (which detects an abnormally high attitude and provides minimum-speed values to other computers) froze.

Here we find the first theoretical issue: the information asymmetry between the crew and the new generation of airplanes. The Flight Management Computer processes data coming from many sources, among them the pilots' inputs, values collected from other computers, and sensors placed around the fuselage. When the Flight Management System receives an input from the pilot, it matches the request against an internal database to verify whether it is correct; if not, the system replies "Out of range", "Not in database", "Format error" and so on. On the other hand, information arriving at the computer from sensors is generally taken for granted. In other words, the computers cannot detect whether a probe providing air-pressure information is reliable, and they process the incoming information from Pitot tubes and static ports whatever the provided value. Consequently, an asymmetry arises between man and machine: the former cannot detect (except in blatant cases) whether the computer is providing erroneous information such as computed speeds and protection data. In this case, the stall speed calculated from the frozen sensor was too low, about twenty knots lower than the real stall speed.

This kind of airplane has flight laws that protect pilots from high and low speed, so the pilots relied on these protections and on the indications available on the primary instruments. In addition, something contributed to conceal the ongoing threat: the auto-trim function, a feature that keeps the airplane balanced by moving the horizontal stabilizer, whose action is not perceivable by the pilots. As the airplane slowed toward the (erroneous) minimum speed, the attitude increased while the stabilizer reached its rear limit, leaving the pilots little margin to maneuver out of this unwanted flight condition. It is important to point out that the flight protections in "normal law" engage according to the indications provided by other computers; since the minimum speed was (wrongly) calculated to be around 90 knots, no feature at all was helping the pilots. As it kept slowing down, the airplane started to buffet and shake, owing to the approaching aerodynamic stall. Eventually, the pilots applied maximum thrust to recover from the unusual attitude. Unfortunately, a sudden application of thrust from wing-mounted engines produces a pitch-up moment, which increased the angle of attack and caused the airplane to stall. Due to insufficient altitude, the crew was not able to recover and, after a few seconds, the aircraft crashed.
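To make the validation asymmetry described above concrete, here is a minimal sketch in Python. It is purely illustrative, not the actual Airbus flight software: all names, values and the formula are hypothetical. Pilot entries are cross-checked against a database and rejected when invalid, while a frozen sensor's output is indistinguishable from a valid one and flows unchecked into downstream computations.

    # Minimal, purely illustrative sketch of the validation asymmetry
    # (hypothetical names, values and formula; not real flight software).

    NAV_DATABASE = {"LFBO", "LFPG", "EGLL"}  # toy navigation database

    def accept_pilot_input(waypoint):
        # Pilot inputs are cross-checked against an internal database.
        if waypoint not in NAV_DATABASE:
            return "NOT IN DATABASE"  # the FMS rejects the entry
        return "ACCEPTED"

    def computed_minimum_speed(aoa_degrees):
        # Sensor values are taken for granted: a frozen angle-of-attack
        # probe still delivers a plausible-looking number, so the
        # computed minimum speed is silently wrong (toy formula).
        return 90.0 + 2.0 * aoa_degrees

    print(accept_pilot_input("ZZZZ"))    # rejected: the pilot's entry is checked
    print(computed_minimum_speed(2.0))   # accepted: the frozen probe goes unnoticed

The design choice the sketch highlights is that only one of the two input paths has a plausibility check; the sensor path has none, which is precisely where the asymmetry between man and machine arises.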

What is the lesson learnt from this case? Excessive trust in automation relaxes the pilot's monitoring of the flight envelope, lowers critical scrutiny of automatic system behavior, and reduces overall knowledge of what lies "behind the screen". All these elements lead to diminished flexibility in coping with the unexpected, since the inner logic of the systems is often unclear. While in conventional aircraft there was room for deviations to reach the intended goal, using the machine even in unusual, unpredicted, unorthodox ways (behavior grounded in a deep knowledge of the systems), we now face new challenges. Complacency toward computers, diminished flying skills, and impaired situational awareness caused by unease in dealing with opaque systems make pilots reluctant to deviate from procedures. This trend induces a rigidity ill-suited to coping with the complexity, variability and unforeseeable situations of real operational life.

4 - Human Factor and Technology

There are different conceptions of ergonomics, as emerges from the evolution of the discipline over the years. Initially, ergonomics was conceived as a corrective means: experts tried to understand how to make a system better after the misuse of something badly designed. Preliminarily, there is a difference between a user-friendly concept and a pilot-friendly concept: the first suits a generic user, while the second takes into consideration the cognitive patterns of the qualified operator who is intended to be the final user. Pilots are target-oriented, and they need to understand thoroughly the systems they use in order to reach their target, optimizing time and performance. According to Erik Hollnagel, the ETTO (Efficiency-Thoroughness Trade-Off) principle is widely adopted by front-line operators to reach a compromise between thoroughness, precision, accuracy and the time needed to carry out the task.

In conventional airplanes, pilots knew how a system was designed, what its logic was, what its components were and what the related procedures prescribed. This knowledge allowed pilots to understand what they were doing and, occasionally, how to deviate if conditions warranted, bringing flexibility to the system's use. In this situation technology and automation are the arm, while the pilot is still the brain, holding the big picture of the flight operation. In new-generation airplanes, pilots no longer know exactly what they are using, because of the opacity of the technology. The inputs given by the pilot are not addressed to a single system but are processed by several computers, and even the interrelation between computers sometimes makes the system unpredictable (automation surprise). So a kind of "push-the-button syndrome" affects newly trained pilots: as in an elevator, they behave as an extension of the computer rather than the opposite.

Emergency procedures, likewise, are not designed around the pilot. Sometimes during an emergency time is short and some items must be skipped to get the job done; the pilot knows how to give priority to some actions while neglecting others. In the new generation of airplanes the pilot must strictly follow what the computer indicates, not only to resolve the emergency but also because completing the procedure is what enables the system for the next phase of flight. The computer thus sets the pace of the operation even during an emergency, and in this condition a short-cut can be risky (a schematic sketch of this gating logic is given below).

Nowadays, human factors experts are involved at an early stage of the design process to keep the system user-friendly. What is actually required is the expertise of someone who can translate an engineering necessity into an operationally suitable system. Consider the number displays on board. According to Gestalt principles, the human mind is more attuned to the overall configuration than to analytical vision. This is all the more true inside a cockpit, given the number of displays, the short time available to detect every single variation, and the need to interpret multiple data at once. In a pilot's mind, symmetry is more important than a precise indication.
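The computer-paced emergency procedure mentioned above can be sketched as a simple gating function. This is a hypothetical Python illustration, not the real ECAM or Airbus logic, and the item names are invented: the next phase is enabled only once every displayed item is completed, so a time-saving short-cut leaves the system blocked.

    # Hypothetical sketch of a computer-paced emergency procedure
    # (illustrative only; item names invented, not the real ECAM logic).

    PROCEDURE_ITEMS = ["ENG MASTER OFF", "AGENT 1 DISCH", "FUEL PUMPS OFF"]

    def next_phase_enabled(completed_items):
        # The computer, not the pilot, decides when the procedure is done.
        return all(item in completed_items for item in PROCEDURE_ITEMS)

    # The pilot, pressed for time, skips one item:
    done = {"ENG MASTER OFF", "AGENT 1 DISCH"}
    print(next_phase_enabled(done))  # False: the system stays locked for the next phase

The point of the sketch is that prioritization, which used to live in the pilot's judgment, is here frozen into an all-or-nothing condition set by the machine.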

(Figure 2: The same set of figures presented in two layouts: "analytical vision" and "field vision".)

Given the same figures, it is obviously easier to spot a difference on the latter display, exploiting "field vision", than on the "analytical vision" of the former. The same applies to the speed indicator, namely the speed tape set on one side of the attitude indicator on the PFD (Primary Flight Display). It has a great advantage over the older analogue speed indicator: it can also represent speeds related to the entire operational envelope, such as flap and slat operating limitations, overspeed and the approach-to-stall warning, et cetera. The problem, as a philosophy of flight, is that things go better when the workload is low (perhaps inducing complacency) and worse when there is a major failure: all those useful indications are then removed from the speed tape, leaving the pilot to struggle with a higher mental workload.
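The degraded-mode point can be sketched as follows. This is hypothetical Python for illustration: the cue names (VMO, VFE, VLS) stand for real speed-tape annotations, but the logic is invented. The envelope cues are computed, and therefore displayed, only while the computers trust their data, so they vanish precisely when the workload rises.

    # Hypothetical sketch: envelope cues on the speed tape exist only
    # while the computers can calculate them (illustrative logic only).

    def speed_tape_cues(data_reliable):
        # Return the envelope cues the pilot sees on the speed tape.
        if data_reliable:
            return ["VMO (overspeed)", "VFE (flap/slat limit)",
                    "VLS (lowest selectable)", "stall warning band"]
        # After a major failure the computed cues are removed, leaving
        # the pilot to reconstruct the margins mentally at the worst moment.
        return []

    print(speed_tape_cues(True))   # low workload: rich guidance
    print(speed_tape_cues(False))  # major failure: bare tape, high workload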

5 - Conclusion

In the aviation domain the pilot is the highest resource available to ensure flexibility, because his sound judgment is vital to fill the gap between procedures (which ensure robustness) and the unexpected circumstances arising in operational life. Machines by themselves cannot replace sound judgment. The problem, at the ergonomic level, is to conceive airplanes that are comprehensible to their operators and that take into consideration the final user's decision-making patterns. If we consider an airplane such as the Airbus A-320, which has almost 190 computers managing its systems, we realize that it has become a kind of ecosystem, in which the relationship between input and output is not linear. The real challenge for pilots is to stay "in the loop", maintaining proper situational awareness and trying to understand the underlying logic of the systems and their interactions, even if it is almost impossible to know such a complex architecture thoroughly. What is observable among pilots is a kind of "fear of automation" and of its unpredictable behavior, so training is moving toward a more procedural and methodological approach to flight. Ever more procedures and standards are at the same time a resource and a threat. Final users were accustomed to short-cuts and heuristics to carry out most tasks. If deviating from procedures in a newly conceived aircraft is perceived as risky, where does the flexibility come from to make the system more resilient?


References

Amalberti et al. (1994), "Vigilance, attention, fatigue", chapter 9, in Briefings: A Human Factors Course for Pilots - Reference Manual, Dédale.

Chialastri A., Pozzi S. (2008), "Resilience in aviation", in Computer Safety, Reliability and Security, Springer-Verlag, Heidelberg.

Cooper G.E., White M.D., Lauber J.K. (eds.) (1980), "Resource management on the flightdeck", Proceedings of a NASA/Industry Workshop (NASA CP-2120).

Dekker S., Rignér J. (2000), "Sharing the Burden of Flight Deck Automation Training", The International Journal of Aviation Psychology, 10(4), 317-326, Lawrence Erlbaum Associates.

Dekker S. (2001), "Reconstructing human contributions to accidents", Technical Report -01, Lund University School of Aviation.


Dismukes, Berman, Loukopoulos (2008), The Limits of Expertise, Ashgate, Aldershot, Hampshire.

Flight Safety Foundation (2003), "The Human Factors Implications for Flight Safety of Recent Developments in the Airline Industry", Flight Safety Digest, March-April.


Goeters K.M. (2004), Aviation Psychology: Practice and Research, Ashgate Publishing Ltd., Aldershot, Hampshire.

Hawkins F. (1987), Human Factors in Flight, Ashgate, Aldershot, Hampshire.

Hollnagel E., Woods D., Leveson N. (eds.) (2006), Resilience Engineering: Concepts and Precepts, Ashgate, Aldershot, Hampshire.


Hollnagel E. (2008), "Critical Information Infrastructures: should models represent structures or functions?", in Computer Safety, Reliability and Security, Springer, Heidelberg.


Hollnagel E. (2009), The ETTO Principle: Efficiency-Thoroughness Trade-Off, Ashgate, Surrey, England.

ICAO (1998), Human Factors Training Manual, Doc 9683-AN/950.

ICAO, Human Factors Digest No. 5: Operational Implications of Automation in Advanced Technology Flight Decks (Circular 234).

IATA (1994), Aircraft Automation Report, Safety Advisory Sub-Committee and Maintenance Advisory Sub-Committee.

Norman D. (2005), The Psychology of Everyday Things (Italian translation: La caffettiera del masochista), Giunti Editore, Firenze.


Perrow C. (1984), Normal Accidents: Living with High-Risk Technologies, Basic Books, New York.

Ralli M. (1993), Fattore umano e operazioni di volo, Libreria dell'Orologio, Roma.


Reason J. (1990), Human Error, Cambridge University Press, Cambridge.

Tichauer E.R. (1978), The Biomechanical Basis of Ergonomics: Anatomy Applied to the Design of Work Stations, John Wiley & Sons, New York.

Wiedemann R., Sulzer R., Raulf H.U., Kutschera S., Buhler J., Ebermann H.J., Hamm F. (2004), Human Factor Concepts, Vereinigung Cockpit e.V., Frankfurt am Main, Germany.
