ABSTRACT

When mathematicians and engineers talk of “information theory,” they usually have in mind either Shannon’s or Kolmogorov’s theory of information (the latter also being referred to as “algorithmic information theory”). We’ll examine both theories in this chapter. Shannon’s theory of information attempts to characterize the transmission of character strings across a communication channel where the characters derive from a finite fixed alphabet (such an alphabet can include numerals, e.g., an alphanumeric character set). The alphabet is assumed to have at least two distinct characters. In case there are exactly two characters, these are typically represented by “0” and “1,” and the sequences derived from them are referred to as “bit strings.” It turns out that any finite fixed alphabet with n characters can be represented with bit strings of length k provided that n ≤ 2^k. In fact, all of modern computation encodes the alphanumeric characters on our keyboards in terms of bits (cf. Unicode and the older ASCII).1
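For instance, a four-character alphabet needs bit strings of length 2, since 4 ≤ 2^2, just as 7-bit strings suffice for the 128 characters of ASCII (128 = 2^7). The following minimal Python sketch illustrates the construction; the particular alphabet and function name are illustrative assumptions, not part of the chapter itself.

```python
import math

def fixed_length_encoding(alphabet):
    """Map each character of a finite alphabet to a bit string of length k,
    where k is the smallest integer satisfying len(alphabet) <= 2**k."""
    n = len(alphabet)
    k = max(1, math.ceil(math.log2(n)))  # at least one bit, even for tiny alphabets
    # The i-th character receives i written in binary, padded with leading
    # zeros to exactly k bits.
    return {ch: format(i, f"0{k}b") for i, ch in enumerate(alphabet)}

# Example: a 4-character alphabet fits in bit strings of length 2 (4 <= 2**2).
codes = fixed_length_encoding("ACGT")
print(codes)  # {'A': '00', 'C': '01', 'G': '10', 'T': '11'}
```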