ABSTRACT

This chapter discusses the advantages of the neuron-by-neuron (NBN) algorithm and presents the NBN algorithm for calculating the Jacobian matrix for arbitrarily connected feedforward networks. Compared with the well-known Levenberg–Marquardt (LM) algorithm, the NBN algorithm has several advantages: the ability to handle arbitrarily connected neural networks, forward-only computation, and direct computation of the quasi-Hessian matrix. The NBN algorithm is developed for training arbitrarily connected neural networks using the LM update rule. In the forward computation, the neurons connected to the network inputs are processed first so that their outputs can be used as inputs to the subsequent neurons. The sequence of the backward computation is the opposite of the forward computation sequence. By applying all training patterns, the whole Jacobian matrix can be calculated and stored. The proposed computation is also derived for fully connected neural networks. Error correction is an extension of the parity-N problem to multiple parity bits.
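
The following sketch illustrates the forward/backward ordering summarized above for one training pattern: neurons are processed in topological order in the forward pass, and one row of the Jacobian is obtained by propagating derivatives in the reverse order. It is a minimal illustration under assumed conventions (the network description format, the tanh activation, and the function name nbn_jacobian_row are hypothetical), not the chapter's actual implementation.

```python
import numpy as np

def dtanh(y):
    # derivative of tanh expressed through the activation value y = tanh(net)
    return 1.0 - y * y

# Hypothetical network description: each neuron lists the indices of its input
# nodes and its weights.  Nodes 0..n_in-1 are network inputs; node n_in + k is
# the output of neuron k.  Neurons are listed so that every input of a neuron
# is computed before the neuron itself (the forward-computation ordering).

def nbn_jacobian_row(neurons, n_in, x, out_neuron):
    """Forward pass, then a backward pass yielding one Jacobian row:
    derivatives of one network output w.r.t. every weight and bias."""
    node = np.concatenate([x, np.zeros(len(neurons))])
    # forward computation: neurons connected to the network inputs first
    for k, nrn in enumerate(neurons):
        net = np.dot(nrn["w"], node[nrn["in"]]) + nrn["b"]
        node[n_in + k] = np.tanh(net)
    # backward computation: opposite order to the forward sequence
    delta = np.zeros(len(neurons))
    delta[out_neuron] = dtanh(node[n_in + out_neuron])
    for k in reversed(range(out_neuron)):        # only earlier neurons can influence the output
        s = 0.0
        for m in range(k + 1, len(neurons)):     # neurons fed (directly) by neuron k
            nrn = neurons[m]
            for pos, src in enumerate(nrn["in"]):
                if src == n_in + k:
                    s += nrn["w"][pos] * delta[m]
        delta[k] = dtanh(node[n_in + k]) * s
    # Jacobian row: d(output)/d(weight) = neuron delta times the signal entering that weight
    row = []
    for k, nrn in enumerate(neurons):
        row.extend(delta[k] * node[nrn["in"]])
        row.append(delta[k])                     # bias entry
    return np.array(row)

# Tiny arbitrarily connected example: 2 inputs, hidden neuron 0, output neuron 1;
# neuron 1 also receives the raw inputs (a cross-layer connection).
neurons = [
    {"in": [0, 1],    "w": np.array([0.3, -0.2]),      "b": 0.1},
    {"in": [0, 1, 2], "w": np.array([0.5, 0.4, -0.7]), "b": -0.1},
]
X = np.array([[0.0, 1.0], [1.0, 1.0]])
# stacking the rows over all training patterns gives the whole Jacobian matrix
J = np.vstack([nbn_jacobian_row(neurons, 2, x, out_neuron=1) for x in X])
print(J.shape)  # (2 patterns, 7 parameters: 5 weights + 2 biases)
```

In this sketch the full Jacobian is simply the stack of per-pattern (and, for multi-output networks, per-output) rows, matching the abstract's statement that applying all training patterns yields the whole Jacobian matrix.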