ABSTRACT

Reservoir computing represents a subset of neural network models. The reservoir itself is composed of a large number of nonlinear nodes that are randomly interconnected with fixed weights, constituting a recurrent network. A central tenet of reservoir computing is that complex dynamics can be generated in a medium whose behavior is not necessarily understood theoretically. In a machine learning context, the reservoir concept can be viewed as a kernel method: reservoir techniques extend kernel methods by generating nonlinear functions of the original inputs, which are then passed to a trained linear classifier. The dimensionality of the output feature vector increases with the number of reservoir variables, and dimensionality can be a key factor limiting the convergence rate of known machine learning methods. In a hardware context, reservoirs require far fewer tunable elements than traditional neural network models to operate effectively. ‘Photonic reservoirs’ exploit optical multiplexing strategies to form highly complex virtual networks.
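
To make the abstract's description concrete, a minimal sketch of the reservoir idea follows. It is an illustrative echo state network in plain NumPy, not the paper's implementation: the reservoir sizes, the 0.9 spectral-radius scaling, the ridge parameter, and the sine-prediction toy task are all assumptions. The sketch shows the two ingredients the abstract names: a fixed, randomly interconnected recurrent network of nonlinear nodes, and a linear readout that is the only trained component.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not taken from the paper).
n_in, n_res = 1, 200

# Fixed random weights: neither matrix is ever trained.
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))      # input coupling
W = rng.normal(0.0, 1.0, (n_res, n_res))          # recurrent reservoir
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))   # keep dynamics stable

def run_reservoir(u):
    """Drive the reservoir with an input sequence u of shape (T, n_in)
    and collect the high-dimensional node states as feature vectors."""
    x = np.zeros(n_res)
    states = np.empty((len(u), n_res))
    for t, u_t in enumerate(u):
        x = np.tanh(W @ x + W_in @ u_t)  # nonlinear node update
        states[t] = x
    return states

# Toy task (assumption): one-step-ahead prediction of a sine wave.
T = 1000
u = np.sin(np.linspace(0, 40 * np.pi, T + 1))[:, None]
X = run_reservoir(u[:-1])   # nonlinear features of the input history
y = u[1:]                   # targets

# Only the linear readout is trained, here via ridge regression.
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_res), X.T @ y)
print("train MSE:", np.mean((X @ W_out - y) ** 2))
```

The kernel-method analogy is visible in the last step: the reservoir expands the scalar input into a 200-dimensional nonlinear feature vector, and learning reduces to fitting a linear map on those features.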