ABSTRACT

The availability of sophisticated automated aids in modern aircraft feeds into a general human tendency to travel the road of least cognitive effort (Fiske & Taylor, 1994). Omission and commission errors resulting from automation bias, the tendency to utilise automated cues as a heuristic replacement for vigilant information seeking and processing, have been documented in professional pilots as well as students, and in both one- and two-person crews. Underlying causes of automation-related omission errors have been traced in part to vigilance issues. Crews count on automation to provide the most salient and reliable information about flight progress and system status, and often 'miss' events that are not pointed out to them by automated systems. This tendency is exacerbated by the fact that operators often have incomplete or fuzzy mental models of how various modes of automation work (Norman, 1990; Sarter & Woods, 1992). Additionally, the opaque interfaces of many automated systems, which provide only limited information about actual system status and the criteria underlying their conclusions, make it difficult for even vigilant decision makers to detect errors.