ABSTRACT

Training difficulty in feedforward neural networks has been studied since the resurgence of the back-propagation technique. Using a new approach based on symmetry groups and Pólya's enumeration theorem, this paper discusses the maximum number of different kinds of training difficulty (DKTD) when the training patterns are binary. Results are given for the case N ⩽ 3, where N is the dimension of the input patterns. We find that these numbers are much smaller than intuition suggests, and that the corresponding ratios decrease sharply as N increases, namely 0.444, 0.1605, 0.0236, ⋯.
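
To illustrate the counting technique named in the abstract, the following is a minimal sketch of Pólya/Burnside-style orbit counting over binary patterns. It assumes, purely for illustration, that the symmetry group is the group of input-coordinate permutations and complementations acting on the 2^N binary input patterns; the group actually used in the paper, and hence the resulting counts, may differ.

    # Sketch: counting equivalence classes of binary labelings of {0,1}^n
    # under input permutations and complementations, via Burnside's lemma.
    # The choice of group here is an assumption for illustration only.
    from itertools import permutations, product

    def count_orbits(n):
        """Count orbits of binary labelings of the 2^n input patterns."""
        vertices = list(product((0, 1), repeat=n))
        index = {v: i for i, v in enumerate(vertices)}
        total = 0
        group_size = 0
        for perm in permutations(range(n)):          # permute coordinates
            for flips in product((0, 1), repeat=n):  # complement coordinates
                group_size += 1
                # Induced permutation of the 2^n input patterns.
                mapping = [index[tuple(v[perm[k]] ^ flips[k] for k in range(n))]
                           for v in vertices]
                # A labeling is fixed iff it is constant on each cycle,
                # so the number of fixed labelings is 2^(number of cycles).
                seen = [False] * len(vertices)
                cycles = 0
                for start in range(len(vertices)):
                    if not seen[start]:
                        cycles += 1
                        j = start
                        while not seen[j]:
                            seen[j] = True
                            j = mapping[j]
                total += 2 ** cycles
        # Burnside's lemma: orbits = average number of fixed labelings.
        return total // group_size

    if __name__ == "__main__":
        for n in (1, 2, 3):
            print(n, count_orbits(n))

Under this assumed group the routine returns the number of equivalence classes of complete binary labelings for N = 1, 2, 3; the paper's DKTD counts concern classes of training-pattern sets and need not coincide with these values.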