ABSTRACT
Artificial neural networks (ANNs) are simplified mathematical approximations
of biological neural networks in terms of both structure and function. In
general, ANN functioning has two aspects: (1) the mechanism of information
flow from presynaptic to postsynaptic neurons across the network, and (2)
the mechanism of learning, which dictates how measures of synaptic strength
are adjusted to minimize a selected cost or error function (a measure of the
difference between the ANN output and the desired output). Research in these
areas has produced a wide variety
of powerful ANNs based on novel formulations of the input space, neuron,
type and number of synaptic connections, direction of information flow in the
ANN, cost or error function, learning mechanism, output space, and various
combinations of these.
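The two aspects described above can be illustrated with a minimal sketch: a single sigmoid neuron whose forward pass models information flow from presynaptic inputs to a postsynaptic output, trained by gradient descent on a squared-error cost. The function names, learning rate, and the OR-function training task are illustrative assumptions, not part of the original text.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def forward(weights, bias, inputs):
    # Aspect (1): information flow — a weighted sum of presynaptic
    # inputs passed through an activation to give postsynaptic output.
    z = sum(w * x for w, x in zip(weights, inputs)) + bias
    return sigmoid(z)

def train(samples, lr=0.5, epochs=2000):
    # Aspect (2): learning — adjust synaptic strengths (weights, bias)
    # by gradient descent to minimize the squared-error cost
    # C = 0.5 * (output - target)^2.
    weights, bias = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            y = forward(weights, bias, inputs)
            # dC/dz for a sigmoid output unit: (y - t) * y * (1 - y)
            delta = (y - target) * y * (1.0 - y)
            weights = [w - lr * delta * x for w, x in zip(weights, inputs)]
            bias -= lr * delta
    return weights, bias

if __name__ == "__main__":
    # Hypothetical task: learn the (linearly separable) OR function.
    data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 1)]
    w, b = train(data)
    for x, t in data:
        print(x, round(forward(w, b, x)))
```

A full ANN stacks many such neurons and backpropagates the error gradient through the layers, but the same two mechanisms (forward information flow and cost-minimizing weight adjustment) remain the core of its operation.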