ABSTRACT

In this paper we present a systematic approach to constructing neural network classifiers based on stochastic model theory. We describe a two-step process: first, the stochastic relationship between sample patterns and their classes is modeled by a stochastic neural network; then the stochastic network is converted into a deterministic one that computes the a-posteriori probabilities of its stochastic counterpart. That is, the outputs of the final network estimate a-posteriori probabilities by construction. The well-known practice of normalizing network outputs with the softmax function to permit a probabilistic interpretation is shown to be more than a heuristic, since it is well-founded in the context of stochastic networks. Simulation results show that our networks outperform standard multilayer networks when few training samples are available and the number of classes is large.
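For reference, the softmax normalization mentioned above is the standard one; as a sketch, writing $y_i$ for the $i$-th network output and $K$ for the number of classes (symbols chosen here for illustration, not taken from the paper), the normalized outputs are
$$
p_i \;=\; \frac{e^{y_i}}{\sum_{k=1}^{K} e^{y_k}}, \qquad i = 1, \ldots, K,
$$
which are nonnegative and sum to one, and can therefore be read as estimates of the class a-posteriori probabilities.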