ABSTRACT

This chapter discusses the back-propagation learning rule, the most frequently used learning rule for neural networks. Advances in the technology of very large scale integrated systems have greatly revitalized interest in artificial neural networks. The processing of information by static, feedforward back-propagation neural networks is generally defined as fully concurrent and asynchronous. In some applications the inputs to the neural network are assumed to be binary, while other applications assume analog inputs. The neural network, as a system of highly interconnected, adaptive, neuron-like elements, can be trained to accomplish a particular processing of input signals. Hopfield-type dynamic neural networks have a long history as analog computers, realized as electrical networks of interconnected amplifiers. An optimization problem must first be encoded in a form suitable for neural network processing. The traveling salesman problem belongs to the class of "hard" combinatorial optimization problems.
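As a minimal sketch of the back-propagation rule the chapter discusses, the Python fragment below trains a single-hidden-layer feedforward network with sigmoid units and a squared-error cost by gradient descent. The network sizes, training data, and learning rate are illustrative assumptions and are not taken from the chapter.

    # Minimal illustrative back-propagation sketch (assumed sizes and data).
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    rng = np.random.default_rng(0)
    X = rng.random((4, 2))            # 4 training patterns, 2 analog inputs each
    T = rng.random((4, 1))            # target outputs
    W1 = rng.standard_normal((2, 3))  # input-to-hidden weights
    W2 = rng.standard_normal((3, 1))  # hidden-to-output weights
    eta = 0.5                         # learning rate

    for epoch in range(1000):
        # Forward pass through the static feedforward network
        H = sigmoid(X @ W1)                           # hidden activations
        Y = sigmoid(H @ W2)                           # network outputs
        # Backward pass: propagate the output error toward the inputs
        delta_out = (Y - T) * Y * (1 - Y)             # output-layer error term
        delta_hid = (delta_out @ W2.T) * H * (1 - H)  # hidden-layer error term
        # Gradient-descent weight updates
        W2 -= eta * H.T @ delta_out
        W1 -= eta * X.T @ delta_hid

The same two-step pattern, a forward pass followed by error back-propagation and a weight update, generalizes to networks with more layers; only the bookkeeping for the error terms grows.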