ABSTRACT

The conventional “enlightened engineering” model conceives of public risk management as working rather like a thermostat, in what control theory terms a “closed-loop” system. For convenience, that approach to risk management is here labelled “SPRAT” (standing for “social precommitment to rational acceptability thresholds”). SPRAT places more emphasis on rational decision methods than on other dimensions of institutional design. The underlying design problem is for “society”, guided by the best available scientific consensus, somehow to settle on the appropriate settings for the risk engineers to programme into the “thermostat”. That is, critical levels of risk acceptability or tolerability (with all the perplexing value-of-life conundrums that risk-benefit analysis produces) must be specified so that levels of public risk can be kept within satisfactory bounds. To count as “rational”, such settings need to rest on scientific and bureaucratic norms: toxicological margin-of-safety conventions, risk-benefit analysis, or imputed risk tolerances arrived at by Chauncey Starr's (1969) method of inferring general risk acceptability by “reading across” from cases, such as driving or smoking, where risky activities are undertaken by large numbers of people. It is the job of the “engineers” (through dose-response experiments, construction of fault trees or analysis of historical data) to ensure that the thermostat can detect all conceivable sources of risk and that the mechanisms for corrective action operate smoothly.