ABSTRACT

Entropy is a function that measures the amount of information carried by a probability distribution. We interpret the underlying sample space as a source of information that, at each point in time, emits one of r symbols according to a fixed probability distribution, and we claim that the entropy function measures the average amount of information per letter generated by this source. The key idea behind the definition of entropy lies in the last property of the theorem: the average amount of information contained in a choice among symbols, made according to a given probability distribution, is unchanged when the outcome is revealed in several steps. Moreover, if we pick a long signal from the source at random, then we are almost certain that the relative frequencies of the letters are close to their probabilities.
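To make the step-by-step property concrete, here is one standard formulation, often called the grouping (or recursivity) property. The exact statement of the theorem referred to above is not reproduced in this abstract, so the notation H, the probabilities p_1, ..., p_r, and the base-2 logarithm are assumptions. With

\[
H(p_1,\dots,p_r) = -\sum_{i=1}^{r} p_i \log_2 p_i,
\]

revealing the outcome in two steps, first whether it is one of the first two symbols and then, if so, which of the two it is, leaves the average information unchanged:

\[
H(p_1,\dots,p_r) = H(p_1+p_2,\,p_3,\dots,p_r) + (p_1+p_2)\, H\!\left(\frac{p_1}{p_1+p_2},\,\frac{p_2}{p_1+p_2}\right).
\]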
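The final claim about relative frequencies is an instance of the law of large numbers, and it is the bridge to the "information per letter" interpretation. A sketch of the standard consequence, under the assumption that the source emits letters independently: a typical signal x_1 ... x_n of length n contains the i-th letter approximately n p_i times, so its probability is approximately

\[
P(x_1 \cdots x_n) \approx \prod_{i=1}^{r} p_i^{\,n p_i} = 2^{-n H(p_1,\dots,p_r)},
\]

so that specifying a typical signal of length n costs about n H(p_1, ..., p_r) bits, i.e. H bits per letter on average.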