ABSTRACT

This chapter presents an axiomatic approach to reformulating radial basis function (RBF) neural networks. With this approach, the construction of admissible RBF models reduces to the selection of generator functions that satisfy certain properties. Specific generator functions are selected according to criteria that relate to their behavior when reformulated RBF networks are trained by gradient descent. This chapter also presents batch and sequential learning algorithms developed for reformulated RBF networks using gradient descent. These algorithms are used to train reformulated RBF networks to recognize handwritten digits from the NIST databases.
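A minimal sketch of the underlying idea is given below; it is an illustration rather than the chapter's own formulation, and it assumes an exponential generator function g(v) = exp(beta * v) with hidden-unit responses of the form g(||x - v_j||^2)^(1/(1-m)) for m = 2, which recovers the familiar Gaussian response. Only the output weights are updated here by batch gradient descent; the full algorithms also update the prototypes.

```python
import numpy as np

def generator(v, beta=1.0):
    """Assumed exponential generator function g(v) = exp(beta * v)."""
    return np.exp(beta * v)

def rbf_response(x, centers, m=2.0, beta=1.0):
    """Hidden-unit responses h_j(x) = g(||x - v_j||^2)^(1/(1-m)).
    For the exponential generator and m = 2 this equals exp(-beta * ||x - v_j||^2)."""
    sq_dist = ((x[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
    return generator(sq_dist, beta) ** (1.0 / (1.0 - m))

def train(x, y, n_hidden=8, lr=0.05, epochs=500, seed=0):
    """Batch gradient descent on the output weights under a mean-squared-error loss.
    Prototypes (centers) are fixed to a random subset of the data for brevity."""
    rng = np.random.default_rng(seed)
    centers = x[rng.choice(len(x), n_hidden, replace=False)]
    w = rng.normal(scale=0.1, size=(n_hidden, y.shape[1]))
    for _ in range(epochs):
        h = rbf_response(x, centers)          # (N, n_hidden) hidden responses
        y_hat = h @ w                         # (N, outputs) network outputs
        grad_w = h.T @ (y_hat - y) / len(x)   # gradient of the MSE w.r.t. w
        w -= lr * grad_w
    return centers, w

if __name__ == "__main__":
    # Toy usage: fit a 1-D sine curve with the sketched reformulated RBF model.
    x = np.linspace(-3, 3, 60)[:, None]
    y = np.sin(x)
    centers, w = train(x, y)
    mse = np.mean((rbf_response(x, centers) @ w - y) ** 2)
    print(f"training MSE: {mse:.4f}")
```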