ABSTRACT

Composite neural networks are multilayer networks in which each layer may use a different neuron model: the classical sigmoidal neuron, a kernel neuron (such as the radial basis function neuron), the logical neuron, and so on. This section is devoted to supervised composite neural networks and contains three main parts. The first focuses on radial basis function (RBF) networks, as introduced by Poggio and Girosi. The second presents a special class of neural Bayesian classifiers based on kernel density estimation. In the third part, we briefly describe neural tree architectures and the architecture of the well-known restricted Coulomb energy (RCE) algorithm, stressing their limitations.