ABSTRACT

This chapter introduces the concept of thermodynamic entropy and, building on the mathematical definition of information, the concept of information entropy. It briefly describes the main features of (neg)entropy-based information theory and its implications for a theory of neuronal information in connectionism, with particular reference to Paul Smolensky's Harmony Theory. Smolensky writes: "In harmony theory, the concept of self-consistency plays the leading role. The theory extends the relationship that Claude E. Shannon exploited between information and physical entropy: computational self-consistency is related to physical energy, and computational randomness to physical temperature. The centrality of the consistency or harmony function mirrors that of the energy or Hamiltonian function in statistical physics. Insights from statistical physics, adapted to the cognitive systems of harmony theory, can be exploited to relate the micro- and macrolevel accounts of the computation."
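The information entropy mentioned above is Shannon's measure H = -Σ p_i log₂ p_i, the quantity whose formal kinship with thermodynamic entropy underpins the analogy Smolensky draws. A minimal sketch (the function name and example distributions are illustrative, not from the chapter):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)) in bits.

    Zero-probability outcomes are skipped, since p*log2(p) -> 0 as p -> 0.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: 1 bit of entropy.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence lower entropy.
print(shannon_entropy([0.9, 0.1]))
```

Maximum entropy corresponds to the uniform distribution, mirroring how physical entropy is maximal when microstates are equiprobable; this shared mathematical form is what licenses the energy/temperature analogy in harmony theory.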