ABSTRACT

In the past decade, automated decision aids and expert systems have been implemented in domains such as medical diagnosis, nuclear energy plant operations, sales and scheduling support, and, in the aerospace domain, glass cockpits and Air Traffic Management. Expert decision makers are being exposed to and are required to utilize these aids as they do their jobs and make necessary decisions. In this chapter, I discuss several of the myths, or misconceptions, that are typically associated with human expertise, and with expert systems and automated decision aids. These have been derived from the literature on expert systems and naturalistic decision making, and from interviews with glass-cockpit pilots and instructors, who are system operators in the aviation domain. The intent of the discussion is not to diminish the importance of automated aids, but rather to address several areas in which common perceptions of these systems, and of the users they are intended to serve, do not coincide with reality, and to suggest directions for research on how automated systems might best enhance the expert decision-making process. Although most of the anecdotal examples come from aviation, the issues are relevant to any naturalistic domain in which human expertise is being supplemented or replaced by automated systems.

Consider the following scenario: You are the veteran captain of a major passenger airline, flying a new-generation, two-engine aircraft out of San Francisco. Prior to taxi, you were advised that birds have been sighted at the departure end of the runway. As you leave the ground and begin to climb, the fire handle for your left engine illuminates briefly, then goes out. At the same time, the message "ENGINE FIRE" appears on the Electronic Warning System, and the electronic "ENGINE FIRE CHECKLIST," triggered automatically, outlines the steps for a #1 engine fire, including engine shutdown. The engine indications, displayed on the center panel, show the #1 engine deteriorating for a few moments, then recovering. Readings for the #2 engine, however, show that it is deteriorating rapidly, suggesting that it is actually the more severely damaged engine; nothing about this engine, though, shows up on your electronic aids. You need to go back and land at the same point from which you took off (Mosier, Palmer, & Degani, 1992).

Quite probably, you have ingested some of the birds from the runway into one or both of your engines. This is the dilemma that you face: All of the cues that you have traditionally used to diagnose engine problems, such as engine parameter readings and engine fan and core vibration, are telling you that the left engine is recovering strength and the right engine is deteriorating. However, your electronic decision aids, which have been implemented primarily to help you make quick, accurate decisions, are instructing you to shut down the "fire-damaged" engine, leaving you with only one marginally operative engine. What do you do?