ABSTRACT

A Markov process is a stochastic process characterized by the Markov property: the distribution of the future of the process depends only on the current state, not on the whole history. If the state space consists of countably many states, the Markov process is called a Markov chain and, if there are absorbing states, it is called an absorbing Markov chain. Absorbing Markov chains play a prominent role in finance, for example, by identifying default as an absorbing state. This chapter explains the importance of discrete-time Markov chains by showing various examples from finance. See, e.g., Çinlar (1975), Karlin and Taylor (1975), Anderson (1991), or Kijima (1997) for more information about Markov chains.
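To make the idea of default as an absorbing state concrete, the following sketch sets up a small, entirely hypothetical rating chain with two transient states ("A", "B") and one absorbing default state ("D"), and computes the probability of eventual absorption via the standard fundamental-matrix decomposition. The transition probabilities are illustrative assumptions, not data from the chapter.

```python
import numpy as np

# Hypothetical one-period transition matrix over states (A, B, D).
# Each row sums to 1; the last row makes D absorbing: once the chain
# enters D it stays there forever.
P = np.array([
    [0.90, 0.08, 0.02],  # from rating A
    [0.10, 0.80, 0.10],  # from rating B
    [0.00, 0.00, 1.00],  # from default D (absorbing)
])

# Canonical-form blocks: Q holds transitions among the transient states,
# R holds transitions from transient states into the absorbing state.
Q = P[:2, :2]
R = P[:2, 2:]

# Fundamental matrix N = (I - Q)^{-1}: entry (i, j) is the expected number
# of visits to transient state j starting from transient state i.
N = np.linalg.inv(np.eye(2) - Q)

# N @ R gives the probability of eventual absorption (default) from each
# transient state; since D is the only absorbing state and is reachable,
# absorption is certain.
absorption_prob = (N @ R).ravel()
print(absorption_prob)

# Expected number of periods until default from each starting rating.
expected_time = N.sum(axis=1)
print(expected_time)
```

Even though eventual default is certain in this toy chain, the expected time to absorption differs by starting rating, which is the kind of quantity the fundamental matrix is used to extract.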