ABSTRACT

Data normalisation is the process of making reasonable adjustments to data so that fair and equitable comparisons can be made with other data gathered under the same or similar conditions. The process of normalisation may be referred to as 'benchmarking', implying that the elements to be compared are being measured from the same benchmark. The notion of a level playing field reflects the need to allow each data point to 'compete' on an equal footing with all other data points. Before we use any data for estimating, we really should understand its pedigree. Data for estimating can emanate from many different sources, too many to mention here other than in a generic sense, which we will classify as Primary, Secondary or Tertiary Data. Even currency values that are generally considered to be common across a number of countries may need to be normalised across international boundaries for the same points in time.
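The currency normalisation mentioned above can be sketched as a two-step adjustment: convert each historical cost to a common base currency, then escalate it to a common base year using a price index. The function name, exchange rate and index values below are illustrative assumptions for demonstration only, not published figures.

```python
# Minimal sketch of currency normalisation for estimating data.
# All rates and indices here are illustrative assumptions.

def normalise_cost(amount, rate_to_base_currency, index_then, index_base):
    """Convert a cost to the base currency, then escalate it to the
    base year using the ratio of price indices."""
    in_base_currency = amount * rate_to_base_currency
    return in_base_currency * (index_base / index_then)

# Example: a cost of 1,000 units of a foreign currency, with an
# assumed exchange rate of 1.25 and an index moving from 100 to 110.
normalised = normalise_cost(1000, 1.25, 100.0, 110.0)
print(normalised)  # 1375.0
```

In practice the exchange rate and index must be taken for the same point in time as the original cost, which is why the abstract stresses normalising "for the same points in time".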