ABSTRACT

In this chapter we discuss various divergence measures proposed for discriminating between probability distributions on the basis of uncertainty. The introductory section distinguishes divergence measures from distance measures and lists the various measures available in the literature. Section 2 describes the Kullback-Leibler (KL) divergence. The residual divergence is considered in Section 3, where conditions for monotonicity are derived and expressions for the KL divergence are obtained in the case of equilibrium distributions, relevation transforms, and series and parallel systems; the divergence function in the quantile framework and its past version are also derived. We then discuss the Rényi divergence and its properties; its quantile version yields interesting results that are not covered by the usual definition. This work is extended in Section 4, where the Varma divergence measure is considered, and thereafter the more general Csiszár family is reviewed. A special feature of all these divergences is their behavior in the proportional hazards family. Finally, we present the Chernoff distance in Section 5 and the Lin-Wong measure in Section 6.