ABSTRACT

In Chapter 8 we looked at a range of issues that arise from uncertainty in spatial data, the impact these can have on the analytical products of GIS and, by implication, on the accuracy of environmental simulations. We also saw how these issues might be managed. Sources of uncertainty can be numerous (Figure 8.6) and can be difficult to disentangle. In Chapter 8, the influence of operational uncertainty was played down in order to concentrate on data issues. In a coupling of GIS and environmental simulation modelling (Chapter 7), operational uncertainty can derive from the nature of the algorithms in GIS and from the operation of the simulation model: choice of spatial discretisation and time increment, fixing of parameters, algorithm choice and calibration. Together these issues can be classed as model-induced uncertainty. According to Burrough et al. (1996):

information = conceptual models + data (9.1)

and that the link between quality of information, models and data can be expressed as:

quality [information] = f {quality [model], quality [data]} (9.2)

where ‘model’ encompasses both GIS and environmental simulation. The issue should not be underestimated. We have already seen the tension that exists between natural variation in the real world and the data models that we use in GIS and simulation models. Thus data sets are only approximations of reality. Simulation models are also approximations, as ‘very few earth science processes are understood well enough to permit the application of deterministic models’ (Isaaks and Srivastava, 1989). And yet even if we wish to strive towards perfect models, how would we know when we have one, since ‘verification and validation of numerical models of natural systems is impossible’ (Oreskes et al., 1994)? The answer to this dilemma does not necessarily lie in stochastic models, since there are consequent problems in identifying appropriate probability distributions for all the parameters, and then there is still the chestnut of verification.

But if we are modelling in the search for engineering solutions rather than purely for the pursuit of science, then we should take a sufficing approach in which the quality of the information need only be dependable in that it contributes towards an effective solution to some problem (Chapter 8). So there is another variable to add to equation (9.2) – professional judgement. But this cuts two ways: first, in interpreting the significance of the analytical products that are the outputs of the environmental modelling process; and second, at an earlier stage, in the very choice of data, data processing algorithms, type of simulation model, setting of parameters, achieving an acceptable calibration and so on. Equation (9.2) appears too deterministic, as a good quality model and good quality data in the hands of an inexperienced modeller may not give dependable results. I would therefore propose that:

dependable [information] ≈ f {quality [model], quality [data], experience [professional]} (9.3)

where ≈ implies residual uncertainty (but not necessarily random error).
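
To make concrete why stochastic models only shift the problem, here is a minimal sketch in Python (not from the text; the rational-method peak-flow formula and all parameter values are hypothetical stand-ins for a real simulation model) of Monte Carlo propagation of parameter uncertainty through a deterministic model. The distributions assumed for the runoff coefficient, rainfall intensity and catchment area are themselves professional judgements, so the spread of the output, the residual uncertainty of equation (9.3), is only as dependable as those choices.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 10_000  # number of Monte Carlo realisations

    # Hypothetical "simulation model": a rational-method peak-flow formula,
    # Q = 0.278 * C * i * A, used here purely as a stand-in for any
    # deterministic environmental model.
    def peak_flow(c, i, a):
        return 0.278 * c * i * a  # Q in m3/s for i in mm/h and A in km2

    # Assumed probability distributions: each choice is itself a judgement.
    c = rng.triangular(0.3, 0.5, 0.7, n)                     # runoff coefficient
    i = rng.lognormal(mean=np.log(25.0), sigma=0.3, size=n)  # rainfall intensity, mm/h
    a = rng.normal(loc=12.0, scale=0.5, size=n)              # catchment area, km2

    q = peak_flow(c, i, a)

    # The spread of the output is the residual uncertainty inherited from
    # the assumed input distributions and the model structure.
    print(f"median Q: {np.median(q):.1f} m3/s")
    print(f"5th-95th percentile: {np.percentile(q, [5, 95]).round(1)} m3/s")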

In this chapter we will be looking at equation (9.3) from the perspective of the right-hand side of the equation. Given the plethora of environmental simulation models used by a wide range of disciplines in a large number of situations, I would not be so presumptuous as to evaluate their performance and arrive at a critique of their quality. I leave it to each discipline to establish a means of consensus on the usefulness and applicability of its models. Users should make themselves aware of the assumptions and limitations of models before using them in support of decision-making. Data quality issues have already been covered in Chapter 8. What does need to be examined here are the issues surrounding models where professionals need to make choices. By the end of the chapter, I will tie this in with the data quality debate. In Chapter 10 we will then look at equation (9.3) from the perspective of the left-hand side, that is, making decisions by dealing with any risk in the residual uncertainty.