ABSTRACT

Artificial neural networks (ANNs) constitute a popular method for classification that can generate very complex decision boundaries yet generalize well. While MATLAB toolboxes provide convenience and flexibility, implementing ANNs directly in MATLAB code provides more insight into their construction and operation. The McCulloch–Pitts neuron can be trained using a very simple learning rule, and this rule can be used to classify any linearly separable training set. The gradient descent method is very similar to the LMS algorithm used in adaptive filtering and even employs the same approximation developed by Widrow–Hoff. The most common technique for training multilayer nets is to project the error at the output layer back onto the preceding layers, an approach logically known as backpropagation. ANNs evolved from attempts by biomedical engineering pioneers to model biological neurons. The McCulloch–Pitts neuron performs a weighted sum on its inputs, just as the cell body is thought to do on dendritic inputs.
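The simple learning rule mentioned above can be sketched in a few lines. This is a minimal illustration, not the chapter's own code, and is written in Python/NumPy rather than MATLAB for brevity; the training set (an AND gate, which is linearly separable) and the learning rate are arbitrary choices for the example.

```python
import numpy as np

# McCulloch-Pitts neuron: weighted sum of inputs passed through a
# hard threshold, trained with the classic error-correction rule.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
d = np.array([0, 0, 0, 1])             # desired outputs (AND gate)

w = np.zeros(2)                        # synaptic weights
b = 0.0                                # bias (threshold offset)
eta = 0.5                              # learning rate (arbitrary)

for _ in range(20):                    # a few passes over the training set
    for x, target in zip(X, d):
        y = 1 if w @ x + b > 0 else 0  # weighted sum + threshold
        err = target - y               # error drives the weight update
        w += eta * err * x
        b += eta * err

preds = [1 if w @ x + b > 0 else 0 for x in X]
print(preds)                           # converges to [0, 0, 0, 1]
```

Because the AND problem is linearly separable, the perceptron convergence theorem guarantees this loop reaches a separating boundary in a finite number of updates.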