ABSTRACT

This chapter analyzes the efficiency of different network architectures using parity-N problems. Parity-N problems have been studied extensively in the literature. The N-bit parity function can be interpreted as a mapping that indicates whether the sum of the N elements of a binary vector is odd or even. Based on this simplification, a linear neuron acting as a summator can be used at the network input. In multilayer perceptron (MLP) networks, if connections across layers are permitted, the resulting networks have bridged multilayer perceptron (BMLP) topologies. BMLP networks are more powerful than traditional MLP networks. In BMLP networks with only one hidden layer, all network inputs are also connected directly to the output neuron or neurons. If BMLP networks have more than one hidden layer, a further reduction in the number of neurons is possible for solving the same problem.
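The parity mapping and the summator idea can be made concrete with a short sketch. The Python code below is illustrative only and is not taken from the chapter (the names parity, N, and bits are ours); it shows how a single linear summing neuron collapses all 2^N binary input patterns onto N + 1 points on a line, after which parity depends only on whether that sum is odd or even.

```python
from itertools import product


def parity(bits):
    """N-bit parity: 1 if the number of 1s in the vector is odd, else 0."""
    return sum(bits) % 2


# A linear "summator" neuron reduces each N-bit vector to one integer
# in [0, N]; parity depends only on that sum, so the 2**N patterns
# collapse onto N + 1 distinct points.
N = 4
for bits in product([0, 1], repeat=N):
    s = sum(bits)  # output of the linear input (summator) neuron
    print(bits, "sum =", s, "parity =", parity(bits))
```

For N = 4, the 16 input patterns map onto the five sums 0 through 4, with parity alternating along that line; this is the pattern simplification that the networks discussed in the chapter exploit.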