A Focused Backpropagation Algorithm for Temporal Pattern Recognition
Figure 1 depicts an abstract characterization of the temporal pattern recognition task. Time is quantized into discrete steps. A sequence of inputs is presented to the recognition system, one per time step. Each element of the sequence is represented as a vector of feature values. At each point in time, the system may be required to produce a response, also represented as a vector of feature values, contingent on the input sequence up to that point. In the simplest case, shown in Figure 1, a response is required only after the entire input sequence has been presented.
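This task structure can be sketched abstractly as follows. The sketch is not the algorithm of this paper; it only illustrates the interface just described: one feature vector per discrete time step, an internal state updated step by step, and a response vector produced only after the final input. The names `recognize_sequence`, `step_fn`, and `readout_fn`, and the toy running-mean recognizer, are hypothetical illustrations.

```python
import numpy as np

def recognize_sequence(inputs, step_fn, readout_fn, state0):
    """Consume a sequence of feature vectors, one per time step, and emit a
    response vector only after the final input (the simplest case above)."""
    state = state0
    for x_t in inputs:            # one input vector per discrete time step
        state = step_fn(state, x_t)
    return readout_fn(state)      # response: a vector of feature values

if __name__ == "__main__":
    # Toy recognizer (hypothetical): state accumulates the running mean of the
    # inputs; the readout thresholds each feature at 0.5.
    seq = np.array([[0.0, 1.0],
                    [1.0, 1.0],
                    [1.0, 0.0]])  # 3 time steps, 2 features each
    step = lambda s, x: s + x / len(seq)
    readout = lambda s: (s > 0.5).astype(float)
    print(recognize_sequence(seq, step, readout, np.zeros(2)))
```

In the more general case described above, `readout_fn` would instead be applied at every time step, yielding a response contingent on the inputs seen so far.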