ABSTRACT

The Director of the US Defense Threat Reduction Agency (DTRA) made the following statement months before the 9/11 attacks:

We have, in fact, solved a terrorist problem in the last twenty-five years. We have solved it so successfully that we have forgotten about it; and that is a treat. The problem was aircraft hijacking and bombing. We solved that problem … The system is not perfect, but it is good enough … We have pretty much nailed this thing. 1

The DTRA Director was obviously not the only person who failed to imagine the possibility of suicide pilots flying fully fuelled jetliners into New York skyscrapers. But, then again, why would anyone have had reason to imagine for a second that someone educated enough to fly a plane, with everything in life that comes with that privileged position, would want to kill himself and thousands of other people? The important distinction between risks (probabilities that can be estimated) and uncertainty (probabilities that are unknowable) is relevant here. “We now know that the risk of a successful terrorist attack on the United States in the summer of 2001 was great,” Posner explains, “yet the risk could not have been estimated without an amount and quality of data that probably could not have been assembled.” 2 Because officials cannot accurately calculate the probability of terrorist attacks, they spend much of their time estimating unknowable probabilities. Systemic uncertainty dominates the homeland security environment. Unfortunately, the prevalence of uncertainty means that “failures” are often required to establish previously unmeasurable risks – we needed a 9/11, for example, to acquire a much clearer understanding of the real risks to aviation security.