ABSTRACT

Information theory was founded in 1948 with the publication of Shannon’s paper “A Mathematical Theory of Communication” [107]. In that paper, Shannon defined entropy and mutual information (initially called the rate of transmission) and introduced the fundamental laws of data compression and transmission. In information theory, information is simply the outcome of a selection among a finite number of possibilities, and an information source is modeled as a random variable or as a random process. While Shannon entropy expresses the uncertainty, or information content, of a single random variable, mutual information quantifies the dependence between two random variables and plays an important role in the analysis of a communication channel, a system in which the output depends probabilistically on its input [33, 134, 148]. Since its birth, information theory has interacted with many different fields, such as statistical inference, computer science, mathematics, physics, chemistry, economics, and biology.
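
For a discrete random variable X with distribution p(x), and a pair (X, Y) with joint distribution p(x, y), the two quantities mentioned above are usually written as follows (a standard formulation of Shannon's definitions, not notation particular to this text):

\begin{align*}
H(X) &= -\sum_{x} p(x) \log p(x), \\
I(X;Y) &= \sum_{x,y} p(x,y) \log \frac{p(x,y)}{p(x)\,p(y)} = H(X) - H(X \mid Y),
\end{align*}

so that entropy measures the uncertainty of X, and mutual information measures how much knowing Y reduces that uncertainty.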