ABSTRACT

Probability theory is a key ingredient of any statistical analysis, and this chapter reviews the concepts of probability most relevant to Bayesian analysis. The marginal and conditional distributions that follow from a multivariate distribution are central to the Bayesian approach. The objective is to compute the posterior distribution of the unknown parameters θ, which combines two ingredients: the likelihood of the data given the parameters and the prior distribution. The posterior is the final output of a Bayesian analysis; it contains all the relevant information from the data and the prior, and thus all statistical inference should be based on it. The Bayesian framework provides a logically consistent way to use all available information to quantify uncertainty about model parameters. Applying Bayes’ rule, however, requires specifying both the prior distribution and the likelihood function.
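
As a brief sketch of the rule the chapter builds on, Bayes’ theorem combines these two ingredients as written below; the symbol y for the observed data is assumed notation for this illustration, not taken from the chapter:

\[
  p(\theta \mid y) \;=\; \frac{p(y \mid \theta)\, p(\theta)}{p(y)},
  \qquad
  p(y) \;=\; \int p(y \mid \theta)\, p(\theta)\, \mathrm{d}\theta ,
\]

where p(y | θ) is the likelihood, p(θ) is the prior, and p(y) is the marginal likelihood that normalizes the posterior.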