ABSTRACT

In this paper, we investigate the relationship between Shannon mutual information and the Shannon inequality, using the Kullback-Leibler divergence as a bridge between them. Inspired by this connection, we propose a general-purpose concept of dependence measure, which we call the generalized dependence measure. The motivation for this concept is to construct novel independence measures based on inequality theory (Rassias, 2000). Under the definition of the generalized dependence measure, several classical dependence measures are unified within a single framework of inequality theory. Although the generalized dependence measure does not rigorously satisfy the axioms of a conventional distance, it broadens the possibilities for measuring the independence of random variables.
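The connection invoked above can be stated via the standard information-theoretic identity: mutual information is the Kullback-Leibler divergence between the joint distribution and the product of the marginals, and its nonnegativity (with equality exactly at independence) follows from the Shannon (Gibbs) inequality:

\[
I(X;Y) \;=\; D_{\mathrm{KL}}\!\left(P_{XY} \,\middle\|\, P_X \otimes P_Y\right) \;\geq\; 0,
\qquad
I(X;Y) = 0 \iff X \perp Y .
\]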