ABSTRACT

We compare two unsupervised neural network models that perform redundancy reduction on static input data: one introduced by Peter Földiák, the other by Jürgen Schmidhuber. We also extend both models to handle dynamic, time-varying pattern sequences and compare them in that setting. The results are consistent across static and dynamic input data: although the two models discover very different types of codes, they perform similarly. As code size and bit probability increase, the entropy of the codes increases, and at the same time the redundancy and dependency between code symbols increase. In general, trade-offs must be made between information preservation, redundancy/dependency, generalization/comprehensibility, and the density, distribution, and sparseness of the code. The outcome of this analysis determines which version of which model is most suitable for a given task.