
CHAPTER 4

Transition Matrices, Markov Chains

4.1 Throughout this chapter we denote by E a countable set (possibly finite).

Definition 4.1 A transition matrix on E is a family of real numbers (P(x, y))x,y∈E such that, for every x, y ∈ E,

P(x, y) ≥ 0,   ∑_{y∈E} P(x, y) = 1.   (4.1)
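
For a finite state space, condition (4.1) is easy to check numerically. The sketch below is illustrative only and not part of the text; it assumes E is finite, represents P as a dict of dicts mapping each x to the row P(x, ·), and the function name is_transition_matrix is made up for this example.

```python
# Minimal sketch (assumption: E finite, P stored as a dict of dicts).
def is_transition_matrix(P, tol=1e-12):
    """Check condition (4.1): P(x, y) >= 0 and sum_y P(x, y) = 1 for every x."""
    for x, row in P.items():
        if any(p < 0 for p in row.values()):      # P(x, y) >= 0 for all y
            return False
        if abs(sum(row.values()) - 1.0) > tol:    # row P(x, .) sums to 1
            return False
    return True

# Example: a two-state transition matrix on E = {0, 1}
P = {0: {0: 0.9, 1: 0.1},
     1: {0: 0.4, 1: 0.6}}
print(is_transition_matrix(P))   # True
```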

Thus, for every x ∈ E, P(x, ·) is a probability on E.

Definition 4.2 Let µ be a probability and P a transition matrix on E. A (homogeneous) Markov chain with initial law µ and transition matrix P is a stochastic process (Ω, F, (Fn)n≥0, (Xn)n≥0, P) with values in E such that

i) P(X0 ∈ A) = µ(A), for every A ⊂ E;
ii) P(Xn+1 ∈ A | Fn) = P(Xn, A) a.s., for every A ⊂ E and n ≥ 0,

where, for x ∈ E and A ⊂ E, P(x, A) denotes ∑_{y∈A} P(x, y).
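
Properties i) and ii) translate directly into a simulation recipe: draw X0 from µ, then at each step draw the next state from the row P(Xn, ·). The sketch below is illustrative and not from the text; it assumes E is finite, reuses the dict-of-dicts layout from the previous example, and the names simulate_chain and mu are made up here.

```python
# Minimal simulation sketch (assumption: E finite, mu and P given as dicts).
import random

def simulate_chain(mu, P, n_steps, rng=random):
    """Generate a path (X_0, ..., X_{n_steps}) of a Markov chain (mu, P)."""
    def sample(dist):
        # dist maps states to probabilities; draw one state accordingly
        states, weights = zip(*dist.items())
        return rng.choices(states, weights=weights, k=1)[0]

    path = [sample(mu)]                      # property i): X_0 has law mu
    for _ in range(n_steps):
        path.append(sample(P[path[-1]]))     # property ii): next state drawn from P(X_n, .)
    return path

mu = {0: 0.5, 1: 0.5}
P = {0: {0: 0.9, 1: 0.1},
     1: {0: 0.4, 1: 0.6}}
print(simulate_chain(mu, P, n_steps=10))
```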