ABSTRACT

Information theory for lattices can be used to characterize the spatial structure as well as the spacetime pattern in discrete dynamical systems. In this paper we shall review some of the formalism useful for analyzing correlations and randomness in lattice systems of any dimension.

The metric entropy, the average entropy per lattice site, is a quantity expressing the degree of randomness of a system. For a spatial pattern that is a microstate of a statistical mechanics system, the metric entropy equals the statistical mechanics entropy. But for a pattern that is the state of a macroscopic system, the metric entropy measures disorder at a higher level. We shall also discuss some complexity measures that quantify the degree to which correlation information is distributed throughout the system.
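For a one-dimensional pattern, a standard formulation of this quantity (our notation; the block probabilities and block entropies below are an illustrative sketch, not the paper's definitions) is the block entropy per site in the limit of large blocks:

```latex
% Metric entropy as the large-block limit of entropy per site
% (one-dimensional case; notation S_m, p(x_1...x_m) is illustrative).
S_m = -\sum_{x_1 \dots x_m} p(x_1 \dots x_m)\,\log p(x_1 \dots x_m),
\qquad
s = \lim_{m \to \infty} \frac{S_m}{m}.
```

In practice the difference S_m - S_{m-1} gives an estimate of s that converges from finite blocks.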

The formalism is applied to deterministic and probabilistic cellular automata. The temporal behavior of the spatial metric entropy is analyzed, and its relevance to statistical mechanics is discussed. For a general class of lattice gas models, the metric entropy is nondecreasing in time. The correspondence between the metric entropy and the statistical mechanics entropy shows that this result is closely related to the second law of thermodynamics.
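As a toy illustration of how the temporal behavior of such a spatial entropy can be tracked (a minimal sketch using a noisy elementary cellular automaton of our own choosing, not one of the paper's lattice gas models), the following snippet estimates the entropy per site as the block-entropy difference S_6 - S_5 at successive times:

```python
# Sketch (our illustration, not the paper's code): estimate the spatial
# entropy per site of a one-dimensional probabilistic cellular automaton
# by the block-entropy difference S_m - S_{m-1}, and watch it over time.
import random
from collections import Counter
from math import log2

def step(state, noise=0.05):
    """One update of elementary rule 90 (XOR of the two neighbors,
    periodic boundaries), with each cell flipped with probability noise."""
    n = len(state)
    new = [state[(i - 1) % n] ^ state[(i + 1) % n] for i in range(n)]
    return [c ^ (random.random() < noise) for c in new]

def block_entropy(state, m):
    """Shannon entropy (bits) of length-m blocks read around the ring."""
    n = len(state)
    counts = Counter(tuple(state[(i + j) % n] for j in range(m))
                     for i in range(n))
    return -sum(c / n * log2(c / n) for c in counts.values())

random.seed(0)
state = [0] * 400
state[200] = 1                       # low-entropy initial condition
for t in range(0, 60, 10):
    s_est = block_entropy(state, 6) - block_entropy(state, 5)
    print(f"t={t:3d}  entropy per site ~ {s_est:.3f} bits")
    for _ in range(10):
        state = step(state)
```

Starting from a nearly ordered configuration, the estimate rises toward one bit per site, consistent with the nondecreasing behavior described above for the models treated in the paper.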

The equivalence between n-dimensional cellular automata and (n+1)-dimensional statistical mechanics models is discussed. This correspondence is illustrated by a class of simple probabilistic filter automata that generate spacetime patterns similar to the configurations seen in the two-dimensional Ising model.
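The following sketch gives the flavor of such a construction; the specific heat-bath rule, the coupling to the already-updated left neighbor, and the parameter beta are our assumptions for illustration, not the paper's filter automaton:

```python
# Sketch (an illustrative guess at this kind of rule, not the paper's
# model): cells are updated sequentially left to right, so each new cell
# sees the already-updated left neighbor, the defining trait of a filter
# automaton, plus the cell above, with heat-bath probabilities at
# inverse temperature beta.
import math
import random

def next_row(row, beta=0.6):
    """Generate the next row; new[i] couples to new[i-1] and row[i]."""
    n = len(row)
    new = [0] * n
    left = row[0]                    # seed the sequential sweep
    for i in range(n):
        field = left + row[i]        # local field from the two neighbors
        p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * field))
        new[i] = 1 if random.random() < p_up else -1
        left = new[i]
    return new

random.seed(1)
row = [random.choice((-1, 1)) for _ in range(72)]
for _ in range(24):                  # rows printed in sequence form the
    print("".join("#" if s > 0 else "." for s in row))  # spacetime pattern
    row = next_row(row)
```

Reading successive rows as successive time steps, domains of aligned symbols emerge for larger beta, qualitatively resembling two-dimensional Ising configurations.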