ABSTRACT

We know that increasing the number of weights in a fully trained neural network improves generalization up to a point, while steadily reducing training set error. Increasing the training time likewise improves generalization up to a point, while further reducing training set error. It is as though extra weights sneak into the net during training. Since complexity certainly grows with the number of weights, we conclude that training increases the network's “apparent complexity.”
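
The phenomenon described above can be observed directly in a small experiment. The sketch below is an illustration of our own devising, not code from this paper; the data, architecture, and every hyperparameter in it are assumptions chosen for demonstration. It trains an oversized one-hidden-layer network on a small noisy regression problem and prints training and validation error as training proceeds; one would typically expect training error to keep falling while validation error bottoms out and then climbs, the signature of growing “apparent complexity.”

```python
import numpy as np

# Hypothetical illustration: all names, data, and hyperparameters here are
# assumptions made for this sketch, not taken from the paper.

rng = np.random.default_rng(0)

# Noisy 1-D regression task: y = sin(2*pi*x) + noise.
def make_data(n):
    x = rng.uniform(-1.0, 1.0, size=(n, 1))
    y = np.sin(2 * np.pi * x) + 0.3 * rng.normal(size=(n, 1))
    return x, y

x_train, y_train = make_data(30)   # small training set -> easy to overfit
x_val, y_val = make_data(200)      # held-out set to measure generalization

# Oversized one-hidden-layer net: far more weights than the task needs.
hidden = 50
W1 = 0.1 * rng.normal(size=(1, hidden))
b1 = np.zeros(hidden)
W2 = 0.1 * rng.normal(size=(hidden, 1))
b2 = np.zeros(1)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

def mse(pred, y):
    return float(np.mean((pred - y) ** 2))

lr = 0.05
for epoch in range(1, 20001):
    # Forward and backward pass (plain batch gradient descent on MSE).
    h, pred = forward(x_train)
    grad_out = 2 * (pred - y_train) / len(x_train)
    gW2 = h.T @ grad_out
    gb2 = grad_out.sum(axis=0)
    grad_h = (grad_out @ W2.T) * (1 - h ** 2)
    gW1 = x_train.T @ grad_h
    gb1 = grad_h.sum(axis=0)
    W1 -= lr * gW1; b1 -= lr * gb1
    W2 -= lr * gW2; b2 -= lr * gb2

    if epoch % 2000 == 0:
        train_err = mse(forward(x_train)[1], y_train)
        val_err = mse(forward(x_val)[1], y_val)
        # Typically: training error falls monotonically, while validation
        # error falls, bottoms out, then rises as training continues.
        print(f"epoch {epoch:6d}  train {train_err:.4f}  val {val_err:.4f}")
```

Stopping training at the epoch where validation error is lowest (early stopping) is the practical response to this behavior: it caps the net's apparent complexity even though the number of weights never changes.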