ABSTRACT

In Probability Theory, any collection of r.v.’s {X_t}, where t is a running parameter, is called a stochastic or random process. Usually, though not always, t is a time parameter or is viewed as such, and X_t describes the evolution of a system in time. If t = 0, 1, 2, ..., we talk about a random sequence or a process in discrete time. In this chapter, we consider a particular but important type of such processes, namely, Markov chains.
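To make the notion of a discrete-time process concrete, the following is a minimal simulation sketch of a two-state Markov chain; the states {0, 1} and the transition probabilities in the matrix P are purely illustrative assumptions, not taken from the chapter.

```python
import random

# Illustrative transition probabilities (assumed, not from the text):
# P[i][j] is the probability of moving from state i to state j in one step.
P = {
    0: {0: 0.9, 1: 0.1},
    1: {0: 0.5, 1: 0.5},
}

def simulate_chain(n_steps, start=0, seed=None):
    """Return one realization X_0, X_1, ..., X_{n_steps} of the chain."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        current = path[-1]
        # The next state depends only on the current state,
        # via the row of P indexed by that state.
        next_state = rng.choices(
            population=list(P[current]),
            weights=list(P[current].values()),
        )[0]
        path.append(next_state)
    return path

# One sample path of length 11 (X_0 through X_10).
print(simulate_chain(10, seed=1))
```

The key feature visible in the loop is that each X_{t+1} is drawn using only the value of X_t, which is precisely the Markov property developed in the chapter.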