ABSTRACT

In the present chapter, we continue to concern ourselves with the implementation of associative memories by means of artificial neural networks. One of the major difficulties encountered in VLSI implementations of artificial neural networks is the realization of the extremely large number of interconnections in the network. Reducing the number of connections is therefore of great practical interest. Many of the existing synthesis procedures for associative memories were developed for fully interconnected neural networks, and they do not yield neural networks with a prespecified partial or sparse interconnection structure. Synthesis procedures for neural networks with an arbitrary (prespecified) sparse interconnection structure, or equivalently, with a sparse coefficient matrix, constitute a major addition to neural network theory, and such procedures have many potential practical applications, especially in the areas of associative memories and pattern recognition. (We will define the precise meaning of sparse coefficient matrix later.)

In one of the few existing works dealing with sparsity constraints (using the discrete-time Hopfield model), it is proposed to transform a given neural network into a partially connected or cellular network [6]. However, as pointed out in [6], “the application of the suggested transformation algorithm is severely limited by its quickly growing complexity.”
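To fix ideas, the following is a minimal sketch of what a prespecified sparse interconnection structure means for a discrete-time Hopfield-style network. It is not the synthesis procedure developed in this chapter: the 0/1 mask, the stored pattern, and the Hebbian outer-product weights below are illustrative choices only, used to show a coefficient matrix constrained to a given sparsity pattern.

```python
import numpy as np

def hopfield_step(W, x):
    """One synchronous discrete-time update x <- sgn(W x), with sgn(0) -> +1."""
    y = W @ x
    return np.where(y >= 0, 1, -1)

# Prespecified interconnection structure: entry (i, j) of the coefficient
# matrix may be nonzero only where mask[i, j] == 1 (an assumed example mask).
mask = np.array([[0, 1, 1, 0],
                 [1, 0, 0, 1],
                 [1, 0, 0, 1],
                 [0, 1, 1, 0]])

# A +/-1 pattern to store, with Hebbian outer-product weights restricted to
# the allowed connections by elementwise masking.
p = np.array([1, -1, 1, -1])
W = mask * np.outer(p, p)

# For this particular mask and pattern, the stored pattern remains a fixed
# point of the update even though the network is only partially connected.
assert np.array_equal(hopfield_step(W, p), p)
```

The elementwise mask guarantees the sparse structure by construction; whether stored patterns remain fixed points under a given mask is exactly the kind of question a synthesis procedure for sparsely interconnected networks must answer.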