An Analysis of the Human Error Element in Aviation


Human error is said to be responsible for 75% to 80% of aviation accidents, making it one of the leading causes of aviation disasters. Yet many have resorted to assigning blame to the people who made mistakes instead of asking why those mistakes were made and seeking solutions to their causes. Human error will always be with us; that is a given. We can, however, choose to work with this fact and improve technology and procedures to minimize the damage human error causes. It is not an easy task, nor was it meant to be. But if we so choose, we can turn this apparent weakness into our greatest resource. There is, and always will be, a way to use our humanity to produce remarkable solutions to the problems that human error poses for aviation.

Human error has played a major role in 75% to 80% of aviation accidents and disasters. When we say “human error,” however, we refer not only to pilots but also to maintenance personnel, tower crews, supervisors, and just about anyone involved in the safety of the aircraft. The fact that humans play such a large part in aviation accidents also indicates that humans are the key to minimizing them.

The fact is that people are inaccurate by nature. Human strengths lie in our flexibility, creativity, and adaptability; indeed, the same qualities that make us creative and flexible also make us error-prone. Machine-like precision has never been a human strong point. Yet humans remain the single most important element of aviation safety.

Since humans still control aircraft, no flight is completely error-free. We simply have to admit that a mistake-free flight is, as of now, impossible. Daniel Cilli (2008), of the Southern Region Runway Safety Program Office, said:

“Eliminating human errors is an unrealistic goal since errors are a normal part of human behavior. So the second approach to the controlling of human error is to reduce the consequences of those errors that will occur.” (Cilli, Human Factors and Runway Safety, 2008)

It is my objective in this paper to discuss the elements of human error in general, how human error affects aviation, how to maintain “situational awareness” in aviation, and how to reduce human error in aviation. Perhaps, by discussing these topics, we can help significantly reduce human error in aviation.

To Err is Human

In the book An Introduction to Human Factors Engineering, human error is defined as “inappropriate human behavior that lowers levels of system effectiveness or safety, which may or may not result in an accident or injury” (Wickens, Gordon, & Liu, 1998). In aviation, human error usually describes operator error: the mistake or mistakes of the person or persons directly working with the system. However, human error is not solely responsible for all the “mistakes” that happen in aviation (or anywhere else, for that matter). One reason statistics on human-caused aviation accidents are so high is our tendency to assign blame. “It is in the interests of the company to blame the worker rather than admit deficiencies in their procedures, product or system” (Shealey, 1979). Human or operator error is a very common explanation for aviation accidents; however, studies by Sanders & Shaw (1988) revealed that human error was never the only factor in an accident.

The same book notes that “Rasmussen’s SRK model (skill, rule, knowledge) describes three different levels of cognitive control that might potentially be used by a person during task performance” (Wickens et al., 1998). Simply put, highly experienced people perform skill-based tasks at an automatic, subconscious level, while inexperienced people perform the same tasks at a rule-based level. In novel situations, however, experience becomes a virtual non-issue, and operators must function at the knowledge-based level, reasoning from whatever information is at hand. This is usually where errors happen.

We are familiar with two types of human error: omission and commission. Swain and Guttman (1983), however, proposed four categories, expanding on errors of omission and commission:

  1. Error of Omission – Failed to perform a required action.
  2. Error of Commission – Performed an action incorrectly.
  3. Sequential Error – Performed actions out of the correct order.
  4. Time Error – Performed actions too slowly, too quickly, or too late.

Studies have also found that humans make mistakes between 0.5% and 1.0% of the time in any given task. Chedru & Geschwind (1972) found that even the best writers make grammatical errors 1.1% of the time while writing. And according to Potter (1995), pilots making entries into an aircraft flight management system err a shocking 10% of the time, and this rate climbs even higher under heavy workloads. Figures such as these underscore the importance of studies in this area of aviation.
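To see why even small per-task error rates matter, consider a brief sketch (my own illustration, not drawn from the cited studies): if each task carries an independent error probability p, the chance of at least one error across n tasks compounds quickly.

```python
# Illustration only: assumes each task is an independent trial with the same
# per-task error probability p (a simplification of the figures cited above).

def p_at_least_one_error(p: float, n: int) -> float:
    """Probability of at least one error in n independent tasks."""
    return 1.0 - (1.0 - p) ** n

# At the low end of the cited range (0.5% per task), a 100-task workload
# already carries roughly a 39% chance of containing at least one error.
print(round(p_at_least_one_error(0.005, 100), 2))  # 0.39
```

Under these simplifying assumptions, a per-task error rate that looks negligible in isolation becomes near-certain over the hundreds of discrete actions in a typical flight, which is why the consequences of error, not just its frequency, must be managed.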

Aviation and Human Error

Compared to driving a car, the competing tasks that pilots must perform are far more complicated. Although much of this competition is for visual attention, a great deal also involves perceptual, cognitive, and response-related resources. Depending on aircraft type, weather conditions, and other factors, the pilot’s workload can swing from underload to extreme overload at any time.

While error on the part of pilots is indeed a possibility, we cannot discount the fact that, “there is a growing appreciation that designers, manufacturers, corporate decision makers, and maintenance managers can make mistakes, which, in turn, create the conditions promoting errors on a flight” (Reason, 1990). These trends have resulted in more human-related errors than component-related failures.

Though it may have been somewhat self-serving, Boeing’s 1997 “Statistical Summary of Worldwide Commercial Jet Airplane Accidents” shows that 71.7% of aviation accidents from 1987 to 1996 were due to the cockpit crew rather than airplane maintenance, weather, and other factors. Whatever its motivation, the report clearly shows the importance of controlling human-related errors. It is also worth noting that human factors in maintenance or air traffic control (ATC) contribute to more than 10% of aviation accidents, many of them caused by lapses in operator attention. Boeing’s report points out that human errors in flight operations can easily lead to tragic loss, while manufacturing errors are rarely disastrous.

Developing and Maintaining “Situation Awareness”

According to the Enhanced Safety through Situation Awareness Integration (ESSAI) consortium, Situation Awareness is, “the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future”.

Operating complex systems, such as aircraft and other aviation systems, depends not only on knowing what to do and how to do it, but also on knowing when to do it. According to Ellen Bass (1996), to become proficient, an operator of a complex system must possess several types of knowledge: “declarative knowledge (what to do), procedural knowledge (how to do it), and operational skill (when to do it)” (Bass et al., 1996). As such, training for an aviation environment has to focus on declarative knowledge, procedural knowledge, and the integration of that knowledge into operational skill. Operational skill is absolutely vital in a dynamic environment like aviation, where the operator must make critical decisions about what to do and when to do it. To make decisions of this magnitude in a risky and ever-changing environment, operators must be able to develop and maintain “situation awareness.”

More than any other aviation personnel, pilots must make some of the most critical decisions in possibly the riskiest and most dynamic of environments. Much of the information necessary for safety is not directly visible to the pilot in its spatial form; rather, “the pilot must depend on an understanding or awareness of the location and future implications of hazards, relative to the current aspect of the aircraft; that is, the pilot must achieve an awareness of the situation” (Wickens et al., 1998). A lack of situation awareness has been blamed for a significant number of accidents in complex and dynamic systems. Technological improvements and automation have been developed to aid pilots’ situation awareness; however, these technologies have been unable to completely eliminate such human errors. This only proves that the technology is only as good as its user, and that training operators to acquire and maintain situation awareness is the key to aviation safety.

Although we know that operators’ situation awareness is key to aviation safety, no single accepted program for developing it has yet emerged. As a state of knowledge about the elements of a dynamic environment, situation awareness is distinct from the decision-making and performance processes, yet intricately associated with both. To this day, our understanding of how situational awareness is acquired and maintained remains imprecise at best. We know that it is enhanced by operators’ internal long-term memory structures, which enable the processing of the large amounts of information that complex situations require; yet we have made little progress in developing this idea.

According to Ellen Bass (1996), the reasons for not recognizing an impending situational awareness problem typically fall into three general categories:

  1. System Data Problems – Despite the careful and increasingly user-friendly design of modern human–computer interfaces, human–computer interaction remains imperfect. It often takes time to analyze data and apply them to a given situation, and at critical moments this delay in data processing can lead to panic and, eventually, disaster.
  2. Human Limitations – No two human decisions are 100% alike, and no two humans react the same way to the same stimulus. Humans are limited by their histories and their emotions. Unfortunately, it is not yet possible to engineer human solutions to situation-awareness problems. No matter how advanced aviation technology gets, human limitations will remain a gray area in critical, dynamic decision-making.
  3. Time-related Problems – The constantly changing state of data is also a problem for situation awareness. It makes potentially dangerous system states harder to detect, because data monitoring and interpretation must be repeated over time, which further reduces the likelihood of properly detecting dangerous data states. Having to monitor particular data among all the system states at a given time compounds the problem.

Minimizing the Possibility of Human Error

Reducing the instances of human error in aviation is no easy task; but, if we are willing to commit to this, the first thing we have to change is our attitude. Predicting how an individual will react in any given situation is guesswork, at best. However, it is possible to assume the worst case scenarios for a given situation. If we base our human error forecasts on the worst possible scenario, it may be possible to create solutions that will avoid these scenarios.

It would be a good idea to identify the system characteristics that lead to errors and then modify the design, either to eliminate the error-inducing situation or, at the very least, to minimize its impact in future events. We can start minimizing human error by:

  1. Eliminating the term “human error” from aviation terminology and vocabulary. This way, we take away the mentality of assigning blame to individuals.
  2. Developing “design specifications that consider the functionality of the human with the same degree of care that has been given to the rest of the system” (Norman, 1990).

Christopher Wickens (1998) suggests that human errors and their negative consequences can be reduced in three ways:

  1. System design – Human errors may be minimized by making it difficult, if not outright impossible, for a person to commit an error, or by making the system error-tolerant, meaning it can absorb errors without catastrophic consequences. Error tolerance can also be achieved by providing feedback and by monitoring user actions for potential errors. Design features can likewise allow users to reverse erroneous actions. The goal is to minimize, if not totally eliminate, risk through system design. Designs should be simple and easy to understand, because reliability goes down as complexity goes up.
  2. Training – Training should push individuals to the extremes. Courses should be designed so that all foreseeable worst-case scenarios are included; by doing this, we condition trainees to expect extreme scenarios, so that when they do occur, the operator is ready. There should be no let-up in training: operators should be allocated training time at least every month.
  3. Personnel selection – The selection and hiring process should be strict and rigorous. We are dealing with people’s lives; therefore, personnel should pass through the proverbial “eye of the needle” before being allowed to take on critical and potentially risky tasks.
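The first point, error-tolerant system design, can be made concrete with a small sketch (all names here are hypothetical, invented purely for illustration): record state before each change so an erroneous action can be reversed, rather than trying to prevent every error outright.

```python
# A minimal sketch of two error-tolerance ideas from the system-design point
# above: keeping actions reversible via an undo history. Class and key names
# are invented for illustration, not taken from any real avionics system.

class ErrorTolerantSettings:
    def __init__(self):
        self._values = {}
        self._history = []  # stack of (key, previous_value) pairs for undo

    def set(self, key, value):
        """Record the old value before changing it, so the change is reversible."""
        self._history.append((key, self._values.get(key)))
        self._values[key] = value

    def undo(self):
        """Reverse the most recent change: an erroneous action can be undone."""
        if not self._history:
            return
        key, previous = self._history.pop()
        if previous is None:
            self._values.pop(key, None)
        else:
            self._values[key] = previous

    def get(self, key):
        return self._values.get(key)

s = ErrorTolerantSettings()
s.set("altitude_alert_ft", 10000)
s.set("altitude_alert_ft", 1000)   # slip: operator dropped a zero
s.undo()                           # the erroneous entry is reversed
print(s.get("altitude_alert_ft"))  # 10000
```

The design choice here mirrors the text: rather than assuming the operator never errs, the system assumes errors will occur and makes their consequences recoverable.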

Training and personnel selection are important factors; however, because humans make mistakes, they are still not foolproof. Even the best-trained pilots and operators will make mistakes. So, along with training and selection, we need a very good “error management” system. Such systems have developed over the past two decades to help address human error problems, but they must be updated whenever new input becomes available. While we must accept the inevitability of errors, we must nevertheless strive to maintain performance standards. Error management demands that we “distinguish between an individual being reckless or showing a disregard for the rules, and mistakes that are simply the product of human limitations” (Ragman, 1999). This represents a fundamental shift in aviation philosophy from “excellent airmen commit no errors” to “excellent airmen commit, recognize and resolve errors.”

More about Error Management

The basic premise of error management is that human error is universal and inevitable; it therefore views human performance as a coin with two inseparable sides: performance and error. One cannot exist without the other. Error is universal. Error is inevitable. The second premise of error management is that error itself does not cause an incident, an accident, or a fatality; consequences do. While error is universal and inevitable, consequences are not. This is sound logic: errors happen all the time, while incidents, accidents, and fatalities do not.

Error management resides in the gap between errors and their consequences. In aviation, it dictates that any attempt to address flight safety that does not acknowledge universal and inevitable human error will not succeed. This places errors in a different light: it no longer assigns blame to the crewmember. It assumes technical proficiency, meaning that technically proficient crewmembers will still commit errors, while incompetent crewmembers should not be flying airplanes at all.

Given the principles of error management, we can say that pilot training should be more inclusive. Robert Cohn, a pilot, wrote in his book:

“In retrospect, I was short-changed. When I thought more about it, I realized that I had never been taught or even made aware of many of the things that are crucial to the safe and proper use of an airplane. I had to learn those the hard way.” (Cohn, They Called It Pilot Error, 1994)

Cohn relates that he never received training on most of the physiological, mental, and purely human factors that can seriously detract from safe flying.


Conclusion

In this report, I discussed human error, human error with regard to aviation, situational awareness, error minimization, and error management. I wanted readers to see human error, especially in aviation, in a different light, in the hope that we could all benefit from learning where this thing called error comes from and how we can reduce it.

From Meister’s types of failures to Swain & Guttman’s probabilities of error, we may have many different definitions, but one thing is clear: human error will never be eliminated. All humans make mistakes; that is part of the beauty of being human. Instead of assigning blame, we should determine why an error occurred and how best to prevent it.

Errors, too, are an inevitable part of the aviation industry, which was one of the first to embrace human factors research in order to reduce pilot error. We know now that no matter how good a pilot’s training is, we can never eliminate all errors.

We learned the value of situational awareness and that it is sorely lacking in today’s aviators. We now understand that situational awareness is the perception of the elements in the environment within a volume of time and space, the comprehension of their meaning, and the projection of their status in the near future; and that situational awareness problems can be categorized into system data problems, human limitation problems, or time related problems.

We have seen that human errors and their negative consequences can be minimized in three ways: system design, training, and personnel selection. We now know that personnel selection and training, though extremely important, are not foolproof solutions, and that even the best-trained pilot will still make mistakes. We have also seen how focusing training on error recovery, rather than on error prevention alone, makes a more successful pilot.

Along with human error comes human possibility, and it is boundless. Humans learned to fly. We learned that in flight, we are not perfect. Yet we are our own solution. If we are to make the skies safe, we must remember that the human element will always be the key to aviation safety.


References

  1. Bass, Ellen J., Zenyuh, John P., Small, Ronald L., Fortin, Samuel T. (1996). A Context-based Approach to Training Situation Awareness. In Proceedings of HICS ’96 – Third Annual Symposium on Human Interaction with Complex Systems, Los Alamitos, CA: IEEE Computer Society, pp. 89-95.
  2. Burchell, Bill (2000). Human Factors: Still Trying to Catch On Despite ample evidence supporting the value of human factors training and awareness, the aviation industry has been slow to embrace the concept. Overhaul & Maintenance, VI, 10, 21.
  3. Cilli, Daniel (2008). Human Factors and Runway Safety, The Southern Region Runway Safety Program Office;
  4. Cohn, Robert L. (1994). They Called It Pilot Error: True Stories Behind General Aviation Accidents. TAB Books, New York, New York.
  5. Gero, David (1999). Military Aviation Disasters. Haynes Publishing, Newbury Park, California.
  6. Heerkens, Hans (2001). Safety by design. Interavia, 56, 656, 45-47.
  7. Nader, Ralph and Smith, Wesley J. (1994). Collision Course: The Truth About Airline Safety. TAB Books, BlueRidge Summit, Pennsylvania.
  8. Naval Postgraduate School (1998). Human Factors Checklist. Aviation Psychology, School of Aviation Safety, Summer.
  9. Norman, Donald A. (1990). Commentary: Human error and the design of computer systems. Communications of the ACM, 1990, 33, 4-7.
  10. Phillips, Edward H. (1999). Accidents Raise Issue Of Pilot Psychological Testing. Aviation Week & Space Technology, 151, 21, 43.
  11. Ragman, J.T. (1999). Error management. Flying Safety, 55, 8, 12-15.
  12. Reason, J., (1990). Human Error, Cambridge University Press.
  13. Tullo, Frank J. (2001). Responses To Mistakes Reveal More Than Perfect Rides. Aviation Week & Space Technology, 154, 21, 106.
  14. Wickens, Christopher D., Gordon, Sallie E., and Liu, Yili (1998). An Introduction to Human Factors Engineering. Addison-Wesley Educational Publishers Inc., New York, New York.
