ABSTRACT

We do not expect the Markov chain to converge: since transition probabilities are homogeneous, the probability of leaving a given state does not change in time. However, in analogy with i.i.d. random variables, there are limit theorems under the right scaling. The connection to the i.i.d. case comes from the following construction. Fix an arbitrary state j and define

$$R_1 = T_1 = \tau_j, \qquad R_k = \inf\{t > R_{k-1} : X_t = j\}, \quad k > 1, \qquad T_k = R_k - R_{k-1}, \quad k > 1.$$
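As a minimal sketch of this construction, the snippet below simulates a small (hypothetical) two-state chain, records the successive return times R_k to a fixed state j, and recovers the excursion lengths T_k = R_k − R_{k−1}; the transition matrix and seed are illustrative choices, not taken from the text.

```python
import random

def simulate_chain(P, x0, n, rng):
    """Run n steps of a Markov chain with transition matrix P (list of rows)."""
    path = [x0]
    for _ in range(n):
        r = rng.random()
        acc = 0.0
        row = P[path[-1]]
        for state, p in enumerate(row):
            acc += p
            if r < acc:
                path.append(state)
                break
        else:  # guard against floating-point rounding in the row sum
            path.append(len(row) - 1)
    return path

def return_times(path, j):
    """R_k: successive times t >= 1 with X_t = j; T_1 = R_1, T_k = R_k - R_{k-1}."""
    R = [t for t, x in enumerate(path) if t >= 1 and x == j]
    T = [R[0]] + [b - a for a, b in zip(R, R[1:])]
    return R, T

rng = random.Random(0)
P = [[0.5, 0.5], [0.3, 0.7]]          # illustrative transition matrix
path = simulate_chain(P, x0=0, n=1000, rng=rng)
R, T = return_times(path, j=0)
```

The strong Markov property is what makes T_2, T_3, … i.i.d.: each time the chain hits j it restarts afresh, so the excursion lengths telescope, with R_k equal to the partial sum T_1 + … + T_k.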