ABSTRACT

This chapter introduces the theory needed to describe the procedures that follow. It presents the three main elements of inference, namely estimation, tests of hypotheses, and the forecasting of future observations. The chapter then explains the standard distributions important for Bayesian inference: the Bernoulli, beta, multinomial, Dirichlet, normal, gamma, normal–gamma, multivariate normal, Wishart, normal–Wishart, and multivariate t-distributions. The multinomial and Dirichlet distributions are the foundation for the Bayesian analysis of Markov chains and Markov jump processes. Posterior inference by direct sampling is straightforward whenever the relevant random number generators are available. For stochastic processes, inference again consists of testing hypotheses about unknown population parameters, estimating those parameters, and forecasting future observations. A formula for the posterior probability of the null hypothesis is derived via Bayes' theorem and illustrated for the Bernoulli and Poisson populations. The most useful population for Markov chains is the multinomial, which models the numbers of one-step transitions among the states of a finite chain.
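The Dirichlet-multinomial conjugacy and direct-sampling ideas summarized above can be sketched as follows. This is a minimal illustration, not the chapter's own code: the transition counts, the uniform prior, and the function names are hypothetical; the conjugate update (a Dirichlet prior on each row of the transition matrix combined with multinomial transition counts yields a Dirichlet posterior) and the Gamma-normalization method for drawing Dirichlet variates are the standard results being illustrated.

```python
import random

def sample_dirichlet(alpha, rng):
    """Draw once from Dirichlet(alpha) by normalizing independent Gamma variates."""
    g = [rng.gammavariate(a, 1.0) for a in alpha]
    total = sum(g)
    return [x / total for x in g]

rng = random.Random(1)

# Hypothetical one-step transition counts for a 3-state chain:
# counts[i][j] = observed transitions from state i to state j.
counts = [[10, 3, 2],
          [4, 12, 4],
          [1, 5, 9]]
prior = [1.0, 1.0, 1.0]  # uniform Dirichlet prior on each row

# Posterior for row i is Dirichlet(prior + counts[i]); estimate each
# transition probability by its posterior mean over direct samples.
n_draws = 5000
post_mean = []
for row in counts:
    alpha = [a + c for a, c in zip(prior, row)]
    draws = [sample_dirichlet(alpha, rng) for _ in range(n_draws)]
    post_mean.append([sum(d[j] for d in draws) / n_draws
                      for j in range(len(alpha))])

for row in post_mean:
    print([round(p, 3) for p in row])
```

For this conjugate model the posterior mean is available in closed form, (prior + counts) normalized row by row; the sampling version is shown because the same direct-sampling pattern carries over to posterior quantities that lack closed forms.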