ABSTRACT

Automated decision aids and expert systems are being deployed in real-world, high-risk domains such as medical diagnosis, nuclear power plant operation, engineering design, and, in aerospace, glass cockpits and air traffic management. Human decision makers are increasingly exposed to these aids and required to use them as they do their jobs and make consequential decisions. Automated systems are often advertised as functioning just like expert human decision makers. E. L. Wiener and Renwick Curry foreshadowed the tendency of human decision makers to overtrust automated decision aids when they described a phenomenon they termed "primary-backup inversion": the tendency of flight crews to use warning systems as primary indicators of problems rather than as secondary checks. Building a sound relationship between human decision makers and automated decision aids requires attention to design issues on the one hand and human psychology on the other.