ABSTRACT

Simulation has been part of statistics almost since the discipline began. However, the computational demands of all but the most elementary simulations meant that the method saw little practical use until relatively recently. Markov chain sampling, for example, began with Metropolis et al. (1953) as an application in physics; it was later refashioned as a general statistical sampling method by Hastings (1970), and the procedure he proposed became known as the Metropolis-Hastings sampling algorithm. We discuss the method and provide full working code for estimating a Bayesian Poisson model in the final section of the chapter. A further advance was made by the statisticians Stuart and Donald Geman, who in 1984 published a sampling method based on the Metropolis-Hastings algorithm (Geman and Geman, 1984). They named the method Gibbs sampling, after Josiah Willard Gibbs of Yale University, one of the leading physicists and engineers of the 19th century; in 1863 Gibbs was awarded the first PhD in engineering in the United States. Gibbs is perhaps best known for developing the field of statistical mechanics alongside Ludwig Boltzmann and James Clerk Maxwell, and as one of the originators of vector calculus. Today the Metropolis-Hastings sampling algorithm and Gibbs sampling are the two foremost sampling methods used in Bayesian modelling. Here, we cover only the former.