ABSTRACT

When continuous phenomena are monitored through discrete observations, the raw data must undergo considerable processing before they can be interpreted at a higher level. In a geostatistical approach, this processing includes determining and applying the statistical properties of the observed field. Any monitoring system must operate with limited resources: the number of observations that can be carried out, the available processing capacity, and the storage space needed for archiving. This chapter addresses these issues. A heuristic is proposed to estimate the observation density necessary to represent the phenomenon adequately. A sequential approach is introduced to mitigate the computational burden of kriging, and a compression algorithm is presented that reduces the volume of observational data. Beyond these concrete optimizations of the monitoring process, the architectural framework provides tools to evaluate different approaches and configurations by simulation. The principle is to generate a continuous random grid, sample observations from it, and derive an interpolated grid from those observations. The deviation between the synthetic random field grid and the interpolated grid is an indicator of the quality of the monitoring process as a whole. A generic approach to quantifying the required computational workload is also proposed. Combining these features with a module for the systematic variation and evaluation of methods and parameters constitutes the architectural framework for incremental improvement introduced in this chapter.
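The simulation-based evaluation principle described above can be sketched in a few lines. The following is a minimal illustration only, not the chapter's actual implementation: the synthetic field is built as a sum of random low-frequency sinusoids rather than a geostatistically simulated field, inverse-distance weighting stands in for kriging, and all grid sizes and sample counts are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1. Synthetic "truth": a smooth random field on an n-by-n grid,
#    built from random low-frequency sinusoids (a simple stand-in
#    for a geostatistically simulated random field).
n = 50
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
truth = np.zeros((n, n))
for _ in range(8):
    fx, fy = rng.uniform(1, 4, 2)
    phase = rng.uniform(0, 2 * np.pi)
    truth += rng.normal() * np.sin(2 * np.pi * (fx * x + fy * y) + phase)

# 2. "Monitoring": observe the field at m randomly chosen grid cells.
m = 100
obs_idx = rng.choice(n * n, size=m, replace=False)
ox, oy = x.ravel()[obs_idx], y.ravel()[obs_idx]
ov = truth.ravel()[obs_idx]

# 3. Interpolation: inverse-distance weighting as a cheap
#    placeholder for kriging.
d2 = (x.ravel()[:, None] - ox) ** 2 + (y.ravel()[:, None] - oy) ** 2
w = 1.0 / (d2 + 1e-12)
interp = (w @ ov) / w.sum(axis=1)
interp = interp.reshape(n, n)

# 4. Quality indicator: deviation (RMSE) between the synthetic
#    field and the interpolated grid.
rmse = np.sqrt(np.mean((truth - interp) ** 2))
print(f"RMSE = {rmse:.4f}")
```

Varying the number of observations `m`, the interpolation method, or the field's roughness and re-computing the deviation is exactly the kind of systematic parameter sweep the framework is meant to support.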