ABSTRACT

The Griddy-Gibbs (GG) sampling algorithm is a Markov chain Monte Carlo (MCMC) method that was first introduced by [10]. As its name suggests, GG is a Gibbs algorithm in which each parameter can be sampled directly from its full conditional distribution with a 100% acceptance rate. A useful feature that distinguishes it from the traditional Gibbs sampler, however, is that the GG sampler can be applied to cases where the likelihood function or the posterior density is intractable. Under GG, the possible range of each parameter is first divided into a number of grid points. The parameter value at each grid point is then evaluated through the full conditional density to obtain an approximate cumulative distribution function (CDF). After that, the realization in each MCMC iteration is obtained using the inverse CDF method. The ranges of the parameters can be fine-tuned in the adaptive GG sampler to improve the performance of the algorithm. However, the use of the adaptive GG sampler in Bayesian computation has been limited, as only one parameter can be sampled at a time, and each parameter has a number of grid points that need to be evaluated sequentially (though unnecessarily so) through its full conditional density. All this makes the algorithm very time-consuming.
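To make the mechanics concrete, the following Python sketch shows one GG update for a single parameter: evaluate the full conditional on a grid, form an approximate CDF, and draw via inverse-CDF sampling. It is a minimal illustration, not the implementation in [10]; the function name, the fixed grid, the trapezoidal approximation of the CDF, and the piecewise-linear inversion are all illustrative assumptions.

```python
import numpy as np

def griddy_gibbs_update(log_cond_density, grid, rng=np.random.default_rng()):
    """One Griddy-Gibbs draw for a single parameter (illustrative sketch).

    log_cond_density : callable giving the log full conditional density
                       (up to an additive constant) at a scalar value.
    grid             : 1-D increasing array of grid points covering the
                       assumed plausible range of the parameter.
    """
    # Evaluate the (unnormalized) full conditional at every grid point.
    log_vals = np.array([log_cond_density(x) for x in grid])
    vals = np.exp(log_vals - log_vals.max())  # subtract max for numerical stability

    # Build an approximate CDF on the grid (trapezoidal rule, then normalize).
    masses = 0.5 * (vals[:-1] + vals[1:]) * np.diff(grid)
    cdf = np.concatenate(([0.0], np.cumsum(masses)))
    cdf /= cdf[-1]

    # Inverse-CDF method: invert the piecewise-linear CDF at a uniform draw.
    u = rng.uniform()
    return np.interp(u, cdf, grid)
```

In a full sampler this update would be applied to each parameter in turn within every MCMC iteration, which is where the sequential grid evaluations noted above become the dominant cost.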