ABSTRACT

Until the past decade, observations of the neuronal networks that subserve associative learning in the brains of living organisms were exceedingly sparse. Recently, some fundamental biophysical and biochemical properties of biological neural networks that demonstrate associative learning have been revealed in the marine mollusc Hermissenda crassicornis. In mammals, we have localized distributed changes, specific to associative memory, in dendritic regions within biological neural networks. Based on these findings, it has been possible to construct an artificial neural network, Dystal (dynamically stable associative learning), that utilizes non-Hebbian learning rules and displays a number of useful properties, including self-organization; monotonic convergence; large storage capacity without saturation; computational complexity of O(N); the ability to learn, store, and recall associations among arbitrary, noisy patterns after four to eight training epochs; weak dependence on global parameters; and the ability to intermix training and testing as new training information becomes available. The performance of the Dystal network is demonstrated on problems that include face recognition and hand-printed Kanji classification. The O(N) scaling of Dystal's computation is demonstrated by its performance on a MasPar parallel computer.
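
As a rough illustration of the kind of non-Hebbian rule the abstract refers to, the following is a minimal Python sketch in which learning is driven solely by the pairing of two input signals, a conditioned-stimulus (CS) pattern and an unconditioned-stimulus (UCS) value, and never by the element's own output activity. The patch-based storage, the correlation similarity measure, and the threshold and rate parameters below are illustrative assumptions for this sketch, not the specification given in the paper.

    import numpy as np

    def similarity(a, b):
        # Correlation-style match score between two patterns (illustrative choice).
        a = a - a.mean()
        b = b - b.mean()
        denom = np.linalg.norm(a) * np.linalg.norm(b)
        return float(a @ b / denom) if denom else 0.0

    class PatchUnit:
        """One output element holding 'patches': (CS prototype, UCS value) pairs."""

        def __init__(self, threshold=0.8, rate=0.25):
            self.threshold = threshold  # similarity needed to merge into a patch (assumed)
            self.rate = rate            # running-average learning rate (assumed)
            self.patches = []           # list of [cs_prototype, ucs_value]

        def train(self, cs, ucs):
            # Non-Hebbian step: the update depends only on the paired CS/UCS
            # inputs, not on this unit's own output.
            best, score = None, -1.0
            for p in self.patches:
                s = similarity(cs, p[0])
                if s > score:
                    best, score = p, s
            if best is not None and score >= self.threshold:
                # Move the matching prototype toward the new CS/UCS pair.
                best[0] += self.rate * (cs - best[0])
                best[1] += self.rate * (ucs - best[1])
            else:
                # No patch matches well enough: allocate a new one.
                self.patches.append([np.asarray(cs, dtype=float).copy(), float(ucs)])

        def recall(self, cs):
            # Report the UCS value stored with the best-matching patch;
            # recall never modifies the stored patches.
            if not self.patches:
                return 0.0
            return max(self.patches, key=lambda p: similarity(cs, p[0]))[1]

    # Usage: associate two noisy CS patterns with distinct UCS values over a few epochs.
    rng = np.random.default_rng(0)
    unit = PatchUnit()
    cs_a, cs_b = rng.random(16), rng.random(16)
    for _ in range(6):
        unit.train(cs_a + 0.05 * rng.standard_normal(16), ucs=1.0)
        unit.train(cs_b + 0.05 * rng.standard_normal(16), ucs=0.0)
    print(unit.recall(cs_a), unit.recall(cs_b))  # ~1.0 and ~0.0

In this sketch each presentation touches each stored patch once, so cost grows linearly with the number of stored elements, and because recall leaves the patches unchanged, training and testing can be freely intermixed; both behaviors parallel the O(N) and intermixing claims above, though the sketch is not Dystal itself.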