ABSTRACT

The objective of Chapter 9 is to provide an introduction to stochastic sequences of random vectors that is relevant to the remaining chapters of the book. The chapter begins by defining important types of data generating processes arising in machine learning problems, including stationary, i.i.d., bounded, and partially observable stochastic sequences. The chapter then emphasizes that, under the assumption that learning takes place in a statistical environment, the learning process must be characterized using concepts of stochastic convergence rather than deterministic convergence. The important concepts of convergence with probability one, convergence in probability, convergence in mean-square, and convergence in distribution are then introduced with examples, and the relationships among these types of stochastic convergence are formally discussed. Stochastic convergence concepts are illustrated by introducing the weak and strong laws of large numbers, the uniform law of large numbers, and the multivariate central limit theorem. Consequences of combining and transforming different types of convergent stochastic sequences are discussed near the end of the chapter.
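As a minimal numerical sketch of one of the convergence concepts named above (not part of the chapter text), the following Python snippet illustrates the weak law of large numbers: the sample mean of an i.i.d. sequence converges in probability to the population mean, so the probability that the sample mean deviates from the mean by at least a fixed tolerance shrinks as the sample size grows. The function name, seed, and tolerance here are illustrative choices, not quantities from the chapter.

```python
# Sketch of convergence in probability via the weak law of large numbers.
# We estimate P(|X_bar_n - mu| >= eps) by Monte Carlo for i.i.d.
# Uniform(0, 1) draws (mu = 0.5) and compare a small n to a large n.
import numpy as np

rng = np.random.default_rng(0)  # fixed seed for reproducibility

def deviation_probability(n, mu=0.5, trials=2000, eps=0.05):
    """Estimate P(|sample mean of n i.i.d. Uniform(0,1) draws - mu| >= eps)."""
    draws = rng.uniform(0.0, 1.0, size=(trials, n))
    sample_means = draws.mean(axis=1)
    return float(np.mean(np.abs(sample_means - mu) >= eps))

# As n grows, the estimated deviation probability shrinks toward zero,
# consistent with convergence in probability of the sample mean to mu.
p_small_n = deviation_probability(10)
p_large_n = deviation_probability(1000)
print(p_small_n, p_large_n)
```

Note that this simulation demonstrates convergence in probability only; convergence with probability one (the strong law) concerns the behavior of entire sample paths and is not directly visible from deviation frequencies at a single sample size.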