ABSTRACT

Markov property: A discrete-time Markov chain $(X_n)$ is a process with discrete time and a discrete state space which satisfies the following Markov property:

$$
P(X_{n+1} = j \mid X_0 = i_0,\, X_1 = i_1,\, \ldots,\, X_n = i_n) = P(X_{n+1} = j \mid X_n = i_n) \tag{4.1}
$$

for any integer time $n \ge 0$ and states $j, i_0, i_1, \ldots, i_n$, provided $P(X_0 = i_0, \ldots, X_n = i_n) > 0$.

This means that, given the present (time $n$), the distribution of the process in the future (at time $n+1$) does not depend on the past (at times $0, 1, 2, \ldots, n-1$).
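To make this concrete, here is a minimal simulation sketch (the two-state chain on $\{0,1\}$ and the probabilities in `P` are hypothetical, not from the text) that checks (4.1) empirically: the conditional frequency of $X_2 = 1$ given $(X_0, X_1) = (0, 0)$ should agree with the frequency given only $X_1 = 0$.

```python
# A minimal sketch (hypothetical two-state chain; not from the text):
# empirically checking the Markov property (4.1).
import random

# Assumed one-step transition probabilities p_ij = P(X_{n+1}=j | X_n=i)
P = {0: [0.7, 0.3],
     1: [0.4, 0.6]}

def step(i):
    """Sample X_{n+1} given X_n = i; only the current state i is used."""
    return 0 if random.random() < P[i][0] else 1

random.seed(0)
paths = []
for _ in range(200_000):                 # many independent paths (X_0, X_1, X_2)
    x0 = random.randint(0, 1)
    x1 = step(x0)
    x2 = step(x1)
    paths.append((x0, x1, x2))

# By (4.1), P(X_2=1 | X_0=0, X_1=0) = P(X_2=1 | X_1=0) = p_{01} = 0.3.
given_00 = [x2 for (x0, x1, x2) in paths if (x0, x1) == (0, 0)]
given_0  = [x2 for (x0, x1, x2) in paths if x1 == 0]
print(sum(given_00) / len(given_00))     # should be close to 0.3
print(sum(given_0) / len(given_0))       # should be close to 0.3
```

Both printed estimates should be close to $p_{01} = 0.3$, reflecting that conditioning additionally on $X_0$ does not change the distribution of $X_2$.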

Transition probability and time homogeneity: The conditional probability

$$
p_{ij} = P(X_{n+1} = j \mid X_n = i) \tag{4.2}
$$

is called the (1-step) transition probability from state $i$ to state $j$ at time $n$. In many applications this quantity does not depend on the time $n$, and the Markov chain is then called time homogeneous. In the sequel, a MC (Markov chain) is always assumed to be time homogeneous unless explicitly stated otherwise.
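For a time-homogeneous chain, the numbers $p_{ij}$ of (4.2) can be collected into a row-stochastic transition matrix. The following sketch (the 3-state matrix and the use of NumPy are illustrative assumptions, not from the text) simulates a path by reusing the same matrix at every time step, which is exactly what time homogeneity permits.

```python
# A minimal sketch (hypothetical 3-state chain): the one-step transition
# probabilities p_ij of (4.2) stored as a row-stochastic matrix and reused
# at every step (time homogeneity).
import numpy as np

P = np.array([[0.5, 0.3, 0.2],           # row i holds p_i0, p_i1, p_i2
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])
assert np.allclose(P.sum(axis=1), 1.0)   # each row must sum to 1

rng = np.random.default_rng(0)

def simulate(x0, n_steps):
    """Simulate X_0, X_1, ..., X_n using the same matrix P at every step."""
    path = [x0]
    for _ in range(n_steps):
        i = path[-1]
        path.append(rng.choice(len(P), p=P[i]))   # next state drawn from row i
    return path

print(simulate(0, 10))
```

If the chain were not time homogeneous, a different matrix $P^{(n)}$ would be needed at each step; assuming homogeneity lets a single matrix describe the whole dynamics.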