ABSTRACT

The objective of Chapter 11 is to provide explicit conditions ensuring the convergence of a large class of commonly used Monte Carlo Markov chain (MCMC) stochastic search algorithms for minimizing an objective function or evaluating computationally intractable expectations. The chapter begins by showing that an irreducible and aperiodic first-order Markov chain on a finite state space converges in distribution to its unique stationary distribution at a geometric rate. Methods for assessing convergence of the chain are also provided. Next, the Metropolis-Hastings algorithm is introduced for the purpose of constructing an irreducible and aperiodic first-order Markov chain that converges in distribution to a pre-specified Gibbs probability mass function on a finite state space. An explicit convergence theorem is provided, together with a procedure for checking its convergence conditions. Special cases of the Metropolis-Hastings algorithm are then discussed, including the independence sampler, the Metropolis algorithm, the random scan Gibbs sampler, the deterministic scan Gibbs sampler, and the random scan block Gibbs sampler. The Metropolis-Hastings convergence theorem is then used to analyze the convergence properties of model search algorithms such as genetic algorithms and MCMC model averaging methods.
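As a concrete illustration of the construction described above, the following is a minimal sketch (not taken from the chapter) of a Metropolis-Hastings sampler whose chain converges in distribution to a Gibbs probability mass function p(x) proportional to exp(-V(x)/T) on a finite state space, using a symmetric uniform proposal. The energy function `V`, the function name `metropolis_hastings`, and all parameter choices are hypothetical.

```python
import numpy as np

def metropolis_hastings(energy, num_states, num_steps, temperature=1.0, seed=0):
    """Sample a finite-state Gibbs pmf p(x) ~ exp(-energy[x]/temperature)
    with a Metropolis-Hastings chain using a symmetric uniform proposal."""
    rng = np.random.default_rng(seed)
    x = int(rng.integers(num_states))       # arbitrary initial state
    counts = np.zeros(num_states)
    for _ in range(num_steps):
        y = int(rng.integers(num_states))   # symmetric proposal q(y|x) = 1/M
        # Acceptance probability for the Gibbs pmf exp(-V/T)/Z;
        # the intractable normalizer Z cancels in the ratio p(y)/p(x).
        accept_prob = min(1.0, np.exp((energy[x] - energy[y]) / temperature))
        if rng.random() < accept_prob:
            x = y
        counts[x] += 1
    return counts / num_steps               # empirical state-visit frequencies

# Example: four states with a hypothetical energy function.
V = np.array([0.0, 1.0, 2.0, 3.0])
empirical = metropolis_hastings(V, num_states=4, num_steps=200_000)
target = np.exp(-V) / np.exp(-V).sum()      # exact Gibbs pmf for comparison
```

Because the resulting chain is irreducible and aperiodic on the finite state space, the empirical visit frequencies approach the target Gibbs pmf as the number of steps grows, consistent with the geometric-rate convergence result stated above.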