Peter B. Ladkin

Article RVS-J-97-10

Human beings are flexible, inventive and adaptable. Even in error. Certain kinds of human error are resilient enough to overcome technology specifically designed to avoid them.

Consider the downing of Korean Air Lines Flight 007 over Sakhalin Island in August 1983. The 1993 International Civil Aviation Organisation report says that the identification was bungled: "...the pilot of one of the USSR interceptor aircraft...had been directed, by his ground command and control units, to shoot down an aircraft which they assumed to be a United States RC-135". While considerable uncertainty remains about many aspects of the accident, this point is one of the best-substantiated, in part through recent interviews with the pilot. Why was such a grievous mistake made?

Cognitive psychologist James Reason uses the term confirmation bias for the partly unconscious tendency to value evidence that confirms a hypothesis, no matter how wrong-headed, while ignoring evidence that contradicts it. This was a very low-tech incident: identification was made visually. Interceptor pilot and controllers were expecting a military aircraft; identification was incomplete and the decision process was rushed; the pilot perceived what he thought were evasive manoeuvres; sensitivity could have been heightened by nearby secret military tests; no commercial flight should have been within hundreds of miles. Clearly, extra electronics would have helped to improve identification - or so one might think.

Or would they? In July 1988, the USS Vincennes, an Aegis-equipped cruiser bristling with electronics to manage a complex wide-area air battle in open ocean, shot down Iran Air Flight 655 on a scheduled flight to Dubai. The Vincennes was fighting small Iranian boats, and made a high-speed manoeuvre that caused chaos in the fighting-systems control room. IR655 was off-schedule, and flying towards the fight. It was first identified as an F-14 attack aircraft, on the basis of a momentary F-14-compatible transponder return. The crew then experienced a form of confirmation bias: the transponder return was consistently misreported thereafter, and the aircraft was perceived as descending towards the Vincennes, whereas it was in fact climbing. The report shows that the decision to fire was made notwithstanding persistent contrary evidence from the electronics, concluding that "stress, task-fixation and unconscious distortion of data may have played a major role...". We can see similar cognitive features in this case - expectations, urgency, a sensitive military situation, (mis?)perceived manoeuvring of the suspect aircraft, confirmation bias and, ultimately, misidentification. So much for sophisticated electronics solving the identification problem.

Now to a current theme. More than half of the 2,200 airliner fatalities during 1988-1995 were due to controlled flight into terrain (CFIT), in which the pilots remain unaware of an imminent collision with the ground. Most CFIT accidents happen on approach to an airport, and usually involve human error. Airline accidents on nonprecision instrument approaches (NPAs) occur five times more frequently than accidents on precision approaches. CFIT is a big killer. So what to do?

Put in more electronic helpers. For example, Boeing 757 pilots are trained to use the flight-management system (FMS) to determine position and course. But this can also go wrong. The report on the CFIT crash at Cali in late 1995 cited as causal factors an ambiguity in the FMS navigation database and an FMS function that erased pertinent course information - information which would likely have exposed a misapprehension on the part of the pilots. No wonder the FMS didn't help. Connoisseurs also see evidence of confirmation bias in the crew's behaviour and in their communication with air traffic control.

But some electronic helpers seem to be almost foolproof. A solution to CFIT has been proposed in the form of electronic equipment called the Enhanced Ground Proximity Warning System (EGPWS), which warns pilots of dangerous terrain ahead of the aircraft. Its predecessor, GPWS, which looks down but not ahead, has been in use in the U.S. for 20 years and seems to have helped. It didn't help at Cali, though, and a recent model didn't help at Guam. But the enhanced version must surely be much better. How could it fail?

One can take a cue from the downing incidents. What could go wrong is precisely what won't change: the human contribution. An airline-pilot colleague who has written FMS handbooks summarises the views of many professional pilots: "Shooting an approach is generally easier in a steam-gauge airplane than in a hi-tech airplane. Less training, less monitoring, less information to sort." More high-tech devices will not alleviate this particular situation, no matter how wonderful they seem.

There appears to be near-unanimity in the aerospace industry on the value of EGPWS. But note that it treats the symptoms, not the cause. How can we judge how well EGPWS will work unless we thoroughly understand CFIT? And that is a question of knowing about human error, not of fancy technology.