ABSTRACT

The concept of entropy originated in thermodynamics and has a history of over a century and a half, dating back to Clausius in 1850. In the 1870s, Boltzmann developed a statistical definition of entropy, thereby connecting it to statistical mechanics. The concept was further advanced by Gibbs in thermodynamics and by von Neumann in quantum mechanics. Outside physics, it was Shannon who, in the late 1940s, developed the mathematical foundation of entropy and connected it to information. Informational entropy is now frequently called Shannon entropy, or sometimes Boltzmann-Gibbs-Shannon entropy. Kullback and Leibler (1951) developed the principle of minimum cross entropy (POMCE), and Jaynes (1957a, b) developed the principle of maximum entropy (POME). Koutsoyiannis (2013, 2014) has given an excellent historical perspective on entropy. Shannon entropy, POME, and POMCE together constitute the entropy theory, which has found a wide spectrum of applications in virtually every field of science and engineering, as well as in the social and economic sciences, and new applications continue to be reported each year (Singh, 2013, 2014, 2015). A review of entropy applications in the hydrological and earth sciences is given in Singh (1997, 2010, 2011).