ABSTRACT

Artificial neural networks have been proposed as a tool for machine learning, and many results have been obtained regarding their application to practical problems in robotics control, vision, pattern recognition, grammatical inference, and other areas. In these roles, a neural network is trained to recognize complex associations between inputs and outputs that are presented during a supervised training cycle. The simplest type of feedforward neural net is the classical perceptron, which consists of a single neuron computing a threshold function. Threshold circuits, that is, feedforward nets with threshold activation functions, have been well studied, and upper and lower bounds have been obtained for them when computing various Boolean functions. Functions of special interest include the parity function and the multiplication and division of binary numbers. Basic backpropagation is the most popular supervised learning method for training multilayer feedforward neural networks with differentiable transfer functions.
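
The classical perceptron mentioned above can be illustrated with a minimal sketch (an assumed example, not taken from this work): a single neuron that outputs 1 when the weighted sum of its inputs crosses a threshold, and 0 otherwise. The weights and bias chosen here realize the Boolean AND function.

```python
def perceptron(weights, bias, inputs):
    """Single-neuron threshold unit: 1 if the weighted sum plus bias
    is nonnegative, 0 otherwise (Heaviside activation)."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if s >= 0 else 0

# Example weights/bias (hypothetical) realizing Boolean AND:
# the sum exceeds the threshold only when both inputs are 1.
def and_gate(x):
    return perceptron([1.0, 1.0], -1.5, x)
```

Because the activation is a hard threshold rather than a differentiable function, such a unit can be analyzed as a threshold circuit but cannot be trained with gradient-based methods like backpropagation, which require differentiable transfer functions.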