The Federal Aviation Administration (FAA) and the National Transportation Safety Board (NTSB) investigate aviation accidents and analyze the data collected to determine root causes. Their findings show that most aviation accidents do not happen in isolation; rather, they result from a chain of events, most of which are avoidable.
Earl Wiener’s Aviation Automation Research Unveils Hazards in Human Factors
Earl Wiener, a University of Miami professor whose research was supported by NASA’s Ames Research Center, was a pioneer in human factors and automation research in aviation. Beginning in the 1970s, he studied how cockpit automation intertwines with pilot errors and accidents. Wiener identified several disturbing cases in which automation fostered complacency and carelessness among pilots, delaying their decision making in dangerous and emergency situations. One of the most troubling cases he reviewed was the 1983 downing of Korean Air Lines Flight 007, which was shot down by the Soviet Union after it veered hundreds of miles off course. Official reports would later cite the crew’s “lack of alertness” as the reason for the tragedy, which killed all 269 people on board when the plane plummeted into the Sea of Japan.
As aircraft manufacturers expanded technological improvements and specialized automation functions in airplanes, pilots were relieved of many of their responsibilities and were growing bored at the controls. Instead of reducing the rate of accidents caused by human error, piloting was becoming more hazardous. Aviation automation improves flight path coordination, eliminates certain human variables, makes more economical use of machines and increases productivity. It was designed to prevent mistakes brought on by pilot fatigue, inattention and other human errors, and it was intended to give pilots more time to focus on details and to make better strategic decisions.
Automation in Aviation Offers False Sense of Security
Some pilots do not use automation technology appropriately. Instead, they rely on it too heavily without scrutinizing the data, miss silent failures and accept a false sense of security rather than monitoring the automated systems carefully. Wiener interviewed pilots to identify the underlying issues. One pilot candidly reported, “I know I’m not in the loop, but I’m not exactly out of the loop. It’s more like I’m flying alongside the loop.”
Wiener had envisioned the aviation industry making slow and careful innovations calibrated to pilots’ abilities and needs. Instead, aviation companies increased automation to address particular errors, but not necessarily in accordance with the needs of the pilots. In 1990, Stephen Casner added a new perspective to Wiener’s research. A graduate of the University of Pittsburgh, Casner earned his doctorate in intelligent systems, exploring the psychological dimension of automation in aviation. After conducting his own research for two years, he came to similar conclusions.
Casner found that cockpit systems were not well understood by the pilots who used them, especially junior pilots, who relied heavily on the automation. He also discovered that the amount of automation a pilot relied on during a flight directly affected how the pilot performed his or her own work in conjunction with the automated systems. About 21 percent of the pilots surveyed were not using their additional time to focus on higher-order tasks; rather, they spent it on commonplace issues, and some even let their skills dull.
In a recent example, pilots' heavy reliance on automation for a strategic judgment call cost three passengers their lives and injured 187 more when Asiana Airlines Flight 214 crashed during an attempted landing in San Francisco in July 2013. All four pilots on board missed the crucial judgment call that the plane was coming in too slowly. Because they relied on the automation to warn them if something was wrong, they failed to notice that the plane was descending at a disastrously low speed. The NTSB cited the pilots’ lack of training on the Boeing 777’s complex automation system and their poor judgment in landing the plane as direct causes of the fatal accident. The Malaysia Airlines flight that recently went missing was also a Boeing 777.
The false sense of security among pilots who count on automation to get the plane safely to its destination creates hazardous conditions. Without hands-on practice, some pilots fail to maintain or develop the skill to determine the best course of action. Pilots should use automation in conjunction with well-honed skills to determine the safest flight conditions. When pilots fail to pair the automation system with complex cognitive decision making, passengers and crew members may be put at risk.