ABSTRACT

This chapter is devoted to discrete-time stochastic processes with discrete state space Z that have the Markov property. A Markov chain is called homogeneous if its transition probabilities do not depend on time. A homogeneous Markov chain is completely characterized by its transition matrix P and an initial distribution p(0). If the state probabilities no longer change with time, the Markov chain is said to be in equilibrium, and the probabilities π_i are called the equilibrium state probabilities of the Markov chain. Markov chains in discrete time occur in virtually all fields of science, engineering, operations research, economics, risk analysis, and finance. An irreducible Markov chain is either recurrent or transient; in particular, an irreducible Markov chain with finite state space is recurrent. The relationships between the multi-step transition probabilities of a discrete-time Markov chain are called the Chapman-Kolmogorov equations.
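
For concreteness, the two relations highlighted above can be stated explicitly; this is a minimal sketch of the standard formulation, where the notation p_{ij}^{(n)} for the n-step transition probabilities and π_i for the equilibrium probabilities is assumed here rather than taken from the chapter itself. The Chapman-Kolmogorov equations read

\[
p_{ij}^{(m+n)} = \sum_{k \in Z} p_{ik}^{(m)} \, p_{kj}^{(n)}, \qquad \text{equivalently} \qquad P^{(m+n)} = P^{(m)} P^{(n)},
\]

so that the n-step transition matrix is the n-th power of P, and the equilibrium state probabilities satisfy

\[
\pi_j = \sum_{i \in Z} \pi_i \, p_{ij}, \qquad \sum_{i \in Z} \pi_i = 1 .
\]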