ABSTRACT

Information theory provides powerful tools that have been successfully applied in a wide variety of fields, including statistical learning theory, physics, communication theory, probability theory, statistics, economics, finance, and computer science. This chapter describes the interpretations and explores decision-theoretic generalizations of the fundamental quantities of information theory, along the lines of Craig Friedman and Sven Sandow, and of Friedman et al. It shows that some of the quantities and results of classical information theory have more general analogs. In particular, the Kullback-Leibler relative entropy satisfies the information inequality: the relative entropy between two probability measures is nonnegative and is zero if and only if the measures coincide.
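
For concreteness, a standard statement of the information inequality in the discrete case is sketched below; the measures $p$ and $q$ and the alphabet $\mathcal{X}$ are notation assumed here for illustration, not taken from the chapter itself.

\[
D(p \,\|\, q) \;=\; \sum_{x \in \mathcal{X}} p(x) \log \frac{p(x)}{q(x)} \;\ge\; 0,
\]
with equality if and only if $p(x) = q(x)$ for all $x \in \mathcal{X}$.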