ABSTRACT

This chapter describes the most standard Monte Carlo methods available for simulating from a posterior distribution associated with a mixture model and reports some experiments on the robustness of the Gibbs sampler in high-dimensional Gaussian settings. It covers some of the proposed solutions, from the original data augmentation of M. Tanner & W. Wong, which pre-dated the Gibbs samplers of A. Gelman & G. King and of J. Diebolt & C. Robert, to specially designed algorithms, as well as the subtleties of label switching. The chapter also explores the inference strategy, which consists primarily of choosing relevant prior distributions, and compares the computational performance of the different algorithms. Efficient Bayesian inference for model-based clustering thus requires Markov chain Monte Carlo (MCMC) algorithms that work well and automatically in large dimensions, with potentially numerous observations, together with smart strategies for deriving a relevant number of clusters. In principle, since the posterior density can be computed up to a normalizing constant, MCMC algorithms should operate smoothly in this setting.
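
To fix ideas, the Gibbs sampler discussed above can be sketched for the simplest case: a two-component univariate Gaussian mixture with known common variance. This is a minimal illustration, not the chapter's implementation; the priors (conjugate N(0, τ²) on the means, symmetric Dirichlet on the weights), the synthetic data, and all numerical values below are assumptions chosen for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from a two-component mixture (illustrative values)
n = 500
z_true = rng.random(n) < 0.4
x = np.where(z_true, rng.normal(-2.0, 1.0, n), rng.normal(2.0, 1.0, n))

def gibbs_mixture(x, n_iter=2000, sigma=1.0, tau=10.0, alpha=1.0, seed=1):
    """Gibbs sampler for a two-component Gaussian mixture with known
    common variance sigma**2, conjugate N(0, tau**2) priors on the means,
    and a symmetric Dirichlet(alpha) prior on the weights (data
    augmentation via the latent allocations z)."""
    rng = np.random.default_rng(seed)
    n = len(x)
    mu = np.array([-1.0, 1.0])   # initial component means
    w = np.array([0.5, 0.5])     # initial mixture weights
    mu_draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # 1. Sample allocations z_i | mu, w (completion step)
        logp = np.log(w) - 0.5 * ((x[:, None] - mu) / sigma) ** 2
        p = np.exp(logp - logp.max(axis=1, keepdims=True))
        p /= p.sum(axis=1, keepdims=True)
        z = (rng.random(n) < p[:, 1]).astype(int)
        # 2. Sample means mu_k | z, x from conjugate normal posteriors
        for k in (0, 1):
            xk = x[z == k]
            prec = len(xk) / sigma**2 + 1.0 / tau**2
            mu[k] = rng.normal((xk.sum() / sigma**2) / prec,
                               1.0 / np.sqrt(prec))
        # 3. Sample weights w | z from the Dirichlet posterior
        counts = np.array([(z == 0).sum(), (z == 1).sum()])
        w = rng.dirichlet(alpha + counts)
        mu_draws[t] = mu
    return mu_draws

draws = gibbs_mixture(x)
post_mean = draws[500:].mean(axis=0)  # posterior means after burn-in
```

With well-separated components like these, the chain settles into one labeling of the components and the posterior means of μ concentrate near the true values; label switching, discussed in the chapter, becomes an issue precisely when the components overlap and the sampler visits several labelings.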