ABSTRACT

Entropy is an important concept in both physics (thermodynamics) (Carnap, 1977; Denbigh and Denbigh, 1985; Kittel, 1956, 1958) and information theory (Shannon and Weaver, 1949; Coleman, 1975; Theil, 1972). In physics, entropy refers to the degree of disorder, randomness, or unpredictability in a substance or physical system. In information theory, it measures the information content of a message, evaluated in terms of its uncertainty. Entropy in these two fields provides the mathematical model from which a family of six inequality measures is derived, as well as the rationale for interpreting those measures.
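The quantity common to both settings is Shannon's entropy, H = -Σ pᵢ log pᵢ, which is maximized when all outcomes are equally likely and falls as the distribution becomes more concentrated. As a minimal sketch (the function name and the choice of base-2 logarithms are illustrative assumptions, not taken from the paper):

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H = -sum(p * log p) of a discrete distribution.

    Zero-probability outcomes are skipped, following the convention
    that 0 * log 0 = 0.
    """
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# Uniform distribution over four outcomes: maximum uncertainty, 2 bits.
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # → 2.0

# A more concentrated distribution carries less uncertainty.
print(shannon_entropy([0.7, 0.1, 0.1, 0.1]))
```

The gap between the maximum possible entropy and the observed entropy is the kind of quantity that entropy-based inequality measures, such as Theil's, build on.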