ABSTRACT

Put loosely, the law of large numbers (LLN) says that the average of a large number of independent, or nearly independent, random variables is usually close to its mean. For some of the mathematics that typically arises in neural modelling, this simple principle has a natural and rewarding application. In one version of this application, equations for the development of long-term memory traces (usually modelled as changes in "synaptic efficacies") are well approximated by more elementary equations, and from these the performance of the model can be more easily anticipated. In a second version, a large system of equations modelling the individual activities of interconnected homogeneous populations of neurons is replaced by a small number of prototype equations which accurately describe the macroscopic dynamics of the network. Models of this latter type might be relevant, for example, to the generation of phrenic nerve activity by the brainstem respiratory centers.
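
For reference, one standard formal statement of the principle invoked above (the abstract's versions allow near-independence; this i.i.d. form is the simplest case, not drawn from the paper itself): for independent, identically distributed random variables $X_1, X_2, \ldots$ with common mean $\mu = E[X_i]$, the weak LLN asserts convergence in probability of the sample average to the mean,

\[
\frac{1}{n}\sum_{i=1}^{n} X_i \;\xrightarrow{\;P\;}\; \mu \quad \text{as } n \to \infty,
\]

that is, $P\bigl(\bigl|\tfrac{1}{n}\sum_{i=1}^{n} X_i - \mu\bigr| > \varepsilon\bigr) \to 0$ for every $\varepsilon > 0$.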