ABSTRACT

An important idea of mathematical information theory is that of the 'uncertainty' of an experiment for whose outcome a discrete probability distribution is given. 'Uncertainty' means here the manner in which the distribution differs from a 'certain' distribution, that is to say, from a distribution that attributes to one possible outcome of an experiment the probability 1 and to all other outcomes the probability 0. Clearly, the distribution furthest removed from this will be the uniform distribution - the one that assigns to each of the n possible outcomes (we are dealing with finite distributions, that is, with a finite n) the probability 1/n. This shows that, as long as we interpret 'probability' objectivistically, the 'uncertainty' of the outcome of an experiment, or of a probability distribution, will also have to be interpreted objectivistically. A very useful mathematical measure of this 'uncertainty', introduced by Shannon, has, interestingly enough, exactly the same mathematical form as Boltzmann's expression for entropy. This is unexpected, but intuitively understandable; for both can be interpreted as probabilistic measures of disorder. A random sequence of 0s and 1s in which both have the probability 0.5 will be more disordered than a random sequence in which the probability of 0 equals 0.9 and that of 1, accordingly, 0.1; for the latter will consist of many 0s, with only here and there a 1 or two.[128] This may be shown by the two sequences:
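(The original pair of sequences is not reproduced here. The following is a minimal sketch of how such a pair can be generated and compared, assuming Python with its standard random and math modules; the names random_sequence and shannon_uncertainty are illustrative, not taken from the text.)

import math
import random

def random_sequence(p_one, length, seed=None):
    # A 0/1 sequence in which each digit is 1 with probability p_one.
    rng = random.Random(seed)
    return ''.join('1' if rng.random() < p_one else '0' for _ in range(length))

def shannon_uncertainty(probabilities):
    # Shannon's measure of the 'uncertainty' of a finite distribution, in bits.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

first = random_sequence(0.5, 40, seed=1)   # 0s and 1s in roughly equal numbers
second = random_sequence(0.1, 40, seed=1)  # mostly 0s, a 1 here and there

print(first)
print(second)
print(shannon_uncertainty([0.5, 0.5]))   # 1.0 bit: the uniform case, maximal 'uncertainty'
print(shannon_uncertainty([0.9, 0.1]))   # about 0.469 bits: far less 'uncertain'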

Clearly there is a sense in which the first of these is more disordered than the second.
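For reference, the formal parallel mentioned above can be written out as follows (the notation is supplied here for illustration and is not taken from the text). Shannon's measure of the 'uncertainty' of a finite distribution p_1, ..., p_n, and the probabilistic (Gibbs-type) form of Boltzmann's expression for entropy, are

H(p_1, \dots, p_n) = -\sum_{i=1}^{n} p_i \log_2 p_i ,
\qquad
S = -k_B \sum_{i=1}^{n} p_i \ln p_i .

On this measure the 'certain' distribution gives H = 0, the uniform distribution gives the maximum H = \log_2 n, and the two distributions above give 1 bit and roughly 0.469 bits respectively.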