ABSTRACT

This chapter examines the nature of the classifications that can be carried out and learned by Hopfield networks and equivalent networks. It then focuses on other types of networks that can solve problems beyond the reach of Hopfield networks. In fact, the ability of Hopfield networks to discriminate between input patterns is equivalent to that of the perceptron, a machine that is already over twenty years old. The activity of a hidden unit can increase or decrease the error between the desired output pattern and the output pattern computed by the network. In the case of NETtalk, it would seem, as reported by T. Sejnowski and C. Rosenberg, that certain hidden units classify phonemes according to the rules of classical phonetics, such as the distinction between vowels and consonants. The perceptron convergence theorem states that if there exists a function that can achieve the chosen classification, then the learning algorithm converges in finite time to a satisfactory function Φ.
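
To fix ideas about the convergence theorem mentioned above, the following is a minimal sketch of the classical perceptron learning rule, not an implementation from the chapter. It assumes NumPy, labels in {-1, +1}, and illustrative names (train_perceptron, max_epochs); when the two classes are linearly separable, the theorem guarantees the mistake-driven updates stop after finitely many corrections.

```python
import numpy as np

def train_perceptron(inputs, targets, max_epochs=100):
    """inputs: (n_samples, n_features) array; targets: labels in {-1, +1}."""
    w = np.zeros(inputs.shape[1])  # weight vector
    b = 0.0                        # bias (threshold)
    for _ in range(max_epochs):
        errors = 0
        for x, t in zip(inputs, targets):
            # Threshold unit: update weights only when a pattern is misclassified
            if t * (np.dot(w, x) + b) <= 0:
                w += t * x
                b += t
                errors += 1
        if errors == 0:   # a separating function has been found
            return w, b
    return w, b           # classes may not be linearly separable

# Example: the linearly separable AND function
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]])
y = np.array([-1, -1, -1, 1])
w, b = train_perceptron(X, y)
print(w, b)  # parameters of a separating hyperplane for AND
```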