ABSTRACT

Humans in systems, e.g., operators and maintenance personnel, are essentially alike: in general, they are adaptive and proactive. These are admirable qualities, but their scope is limited. Adaptive and proactive behaviors change systems continuously, yet humans at the front end alone may be unable to recognise the potential impact of those changes on the system, especially when several changes take effect simultaneously. Humans at the back end, e.g., administrators and regulators, tend to hold sanguine assumptions, such as that the system is always operated as planned and that rules and procedures can hold the system in an optimal state. Mismatches caused by these two tendencies constitute latent hazards, which may cause the system to drift into failure.