Gradient Descent Learning Algorithms: A Unified Perspective
The origin of the first idea is usually attributed to Hebb (1949). Loosely speaking, this is the idea that if the activities of two connected neurons are correlated over time, then the strength of the corresponding synapse should tend to increase; if the activities are uncorrelated, the strength should tend to decrease. This formulation is obviously vague, and Hebbian learning rules can be implemented in neural networks in various ways. It is well known that, although these rules are very simple and local, they can lead to powerful self-organization effects in networks of simple model neurons (see, for instance, Linsker, 1988). Yet it is also clear that a Hebb rule by itself cannot account for the learning of complex behaviors, and that more global organizing principles are also required.
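The correlation-based update described above can be sketched in a few lines. The following is a minimal NumPy illustration, not a definitive implementation: the linear neuron model, the learning rate, and the passive decay term (which plays the role of the "decrease when uncorrelated" half of the rule) are assumptions made here for concreteness, not part of Hebb's original, purely verbal formulation.

```python
import numpy as np

def hebbian_update(w, x, y, lr=0.01, decay=0.001):
    """One Hebbian step: the weight from input j to output i grows in
    proportion to the product y[i] * x[j] (correlated activity), while a
    small passive decay weakens all connections over time."""
    return w + lr * np.outer(y, x) - decay * w

rng = np.random.default_rng(0)
w = 0.01 * rng.standard_normal((3, 5))   # 5 inputs -> 3 linear neurons

for _ in range(100):
    x = rng.standard_normal(5)           # presynaptic activity
    y = w @ x                            # postsynaptic activity (linear model)
    w = hebbian_update(w, x, y)
```

Note that the rule is entirely local: each weight changes using only the activities of the two neurons it connects, which is what makes the self-organization effects mentioned above possible without any global error signal. Without the decay term (or a normalization in the style of Oja's rule), the plain Hebb rule lets the weights grow without bound.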