ABSTRACT

The input is a sequence of items, and the output depends on certain patterns in that sequence. Neural networks, however, challenge our usual notions here: they have neither control flow nor recursion. Those concepts must be invented from scratch, and they take very different forms from the ones we are used to in traditional programming. The input sequence is shifted by one position with respect to the output sequence. Specifically, the underlying function is not just generating inputs and corresponding outputs, as before, but is also performing a one-step shift between the input sequence and the output sequence. One of the earliest works studying neural networks "with cycles" that were able to store state over time was published by W. Little in 1974. A few years later, those ideas were popularized by John Hopfield, in what are now known as Hopfield networks.
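The one-step shift between input and output sequences can be sketched as follows; `shift_pairs` is a hypothetical helper name introduced here for illustration, not something from the source:

```python
def shift_pairs(sequence):
    """Build (inputs, targets) where targets are the inputs shifted one step ahead.

    Each input item is paired with the item that follows it, so a model
    trained on these pairs learns to predict the next item in the sequence.
    """
    inputs = sequence[:-1]   # every item except the last
    targets = sequence[1:]   # every item except the first
    return inputs, targets

tokens = ["the", "cat", "sat", "on", "the", "mat"]
x, y = shift_pairs(tokens)
# x pairs with y position-by-position: ("the" -> "cat"), ("cat" -> "sat"), ...
```

Under this pairing, the same underlying sequence supplies both the inputs and the targets; the shift is what turns it into a supervised prediction task.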