ABSTRACT

Advances in electronic devices have inspired researchers to build intelligent machines operating in a fashion similar to the human nervous system. A biological neuron is a complicated structure that receives trains of pulses on hundreds of excitatory and inhibitory inputs. These incoming pulses are summed with different weights during the latent summation period. A feedforward neural network performs an analogous role in the artificial setting, applying a nonlinear transformation that maps a multidimensional input variable onto another multidimensional output variable. In theory, such a network can realize any input-output mapping provided it has enough neurons in its hidden layers. As in biological neurons, the weights of artificial neurons are adjusted during a training procedure. Various learning algorithms have been developed, but only a few are suitable for multilayer neural networks. The correlation learning rule is based on a principle similar to that of the Hebbian learning rule.
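As a minimal illustration of the two ideas mentioned above, the weighted summation performed by a single artificial neuron and a Hebbian-style correlation weight update, the following Python sketch may help. The function names, learning rate, and numeric values are illustrative assumptions, not the paper's own formulation.

```python
import numpy as np

# Sketch of a single artificial neuron: the output is a weighted sum of the
# inputs passed through a nonlinear activation, loosely analogous to the
# latent summation of excitatory and inhibitory pulses in a biological neuron.
def neuron_output(x, w, activation=np.tanh):
    return activation(np.dot(w, x))

# Correlation (Hebbian-style) learning rule: the weight change is proportional
# to the product of the input and the desired output, so weights grow where
# input activity and target activity coincide. Values here are hypothetical.
def correlation_update(w, x, d, lr=0.1):
    return w + lr * d * x

# Illustrative usage with made-up data
x = np.array([0.5, -1.0, 0.25])   # inputs (excitatory and inhibitory)
w = np.zeros_like(x)              # initial weights
d = 1.0                           # desired output
for _ in range(10):
    w = correlation_update(w, x, d)
print(neuron_output(x, w))
```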