ABSTRACT

The field of information theory has its origin in Claude Shannon's 1948 paper, "A Mathematical Theory of Communication." This chapter deals with Shannon's original problem: reliable communication over a noisy channel. One should keep in mind that information theory is a growing field of research whose profound impact has reached areas such as statistical physics, computer science, statistical inference, and probability theory. An important result of information theory is that, without loss of optimality, the encoder can be decomposed into two parts: a source encoder that compresses the source into a sequence of binary data, and a channel encoder that protects that sequence against channel noise. Similarly, the decoder can be split into a channel decoder and a source decoder. One of the most remarkable results of information theory is a relatively easy-to-compute characterization of the operational capacity, that is, the highest rate at which information can be transmitted reliably over a channel. The chapter considers discrete-time sources and models them as discrete-time stochastic processes.
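As a minimal illustration of such an easy-to-compute capacity formula (a standard textbook example, not drawn from this chapter), the sketch below evaluates the capacity of a binary symmetric channel with crossover probability p, which is C = 1 - H2(p) bits per channel use, where H2 is the binary entropy function; the function names are illustrative.

```python
import math

def binary_entropy(p: float) -> float:
    # H2(p) = -p*log2(p) - (1-p)*log2(1-p), with H2(0) = H2(1) = 0.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    # Capacity of the binary symmetric channel with crossover
    # probability p: C = 1 - H2(p) bits per channel use.
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # noiseless channel: 1.0 bit per use
print(bsc_capacity(0.5))  # completely noisy channel: 0.0 bits per use
```

The endpoints behave as expected: a noiseless channel (p = 0) carries one bit per use, while a channel that flips each bit with probability 1/2 carries no information at all.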