
A natural model for this situation is a mixture model with $J$ components. The marginal likelihood $P(D \mid M_k)$ is obtained by integrating over the parameter space, namely

$P(D \mid M_k) = \int P(D \mid \theta_k, M_k)\, P(\theta_k \mid M_k)\, d\theta_k$,    (10.3)

where $\theta_k$ is the vector of parameters of model $M_k$, and $P(\theta_k \mid M_k)$ is its prior density ($k = 0, 1$).
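Equation (10.3) can be read as a recipe: the marginal likelihood is the prior-weighted average of the likelihood, so drawing $\theta_k$ from the prior and averaging $P(D \mid \theta_k, M_k)$ approximates the integral. Below is a minimal Monte Carlo sketch; the model (normal data with a normal prior on the mean) and all numbers are invented for illustration, not taken from the text.

```python
# Simple Monte Carlo estimate of the marginal likelihood in (10.3):
# average the likelihood over draws from the prior.
import numpy as np

rng = np.random.default_rng(0)
D = rng.normal(loc=1.0, scale=1.0, size=20)      # "observed" data, simulated here

def log_likelihood(theta, data, sigma=1.0):
    """log P(data | theta, M_k) for a normal model with known sigma."""
    return np.sum(-0.5 * np.log(2 * np.pi * sigma**2)
                  - 0.5 * ((data - theta) / sigma) ** 2)

# Prior P(theta | M_k): theta ~ N(0, 2^2); draw prior samples.
theta_draws = rng.normal(loc=0.0, scale=2.0, size=50_000)
log_liks = np.array([log_likelihood(t, D) for t in theta_draws])

# Average on the log scale (log-sum-exp) to avoid underflow.
m = log_liks.max()
log_marginal = m + np.log(np.mean(np.exp(log_liks - m)))
print(f"log P(D | M_k) approx {log_marginal:.3f}")
```

Simple Monte Carlo is used here only because it makes the integral in (10.3) transparent; in practice it can be very inefficient when the likelihood is concentrated relative to the prior.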

One important use of the Bayes factor is as a summary of the evidence for $M_1$ against $M_0$ provided by the data. Table 10.1 gives a rough calibration:

B10          2 log_e B10    Evidence for M1
<1           <0             negative (supports M0)
1 to 3       0 to 2         barely worth mentioning
3 to 12      2 to 5         positive
12 to 150    5 to 10        strong
>150         >10            very strong

Table 10.1 Calibration of the Bayes factor B10
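The cut-points of Table 10.1 are easy to encode directly. The sketch below uses the values from the B10 column (1, 3, 12, 150); the function name and the handling of values falling exactly on a boundary are our own conventions, since the table only gives ranges.

```python
def evidence_category(b10: float) -> str:
    """Map a Bayes factor B10 to the verbal category of Table 10.1."""
    if b10 < 1:
        return "negative (supports M0)"
    elif b10 < 3:
        return "barely worth mentioning"
    elif b10 < 12:
        return "positive"
    elif b10 < 150:
        return "strong"
    return "very strong"

for b in (0.5, 2, 8, 50, 400):
    print(b, "->", evidence_category(b))
```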

The marginal likelihoods yield posterior probabilities of all the models, as follows. Suppose that $K$ models, $M_1, \ldots, M_K$, are being considered. Then, by Bayes' theorem, the posterior probability of $M_k$ is

$P(M_k \mid D) = \dfrac{P(D \mid M_k)\, P(M_k)}{\sum_{l=1}^{K} P(D \mid M_l)\, P(M_l)}$.    (10.4)
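Computationally, (10.4) is just a normalization of marginal likelihood times prior. A sketch, with hypothetical log marginal likelihoods and a uniform prior over models, worked on the log scale for numerical stability:

```python
# Posterior model probabilities via (10.4), from log marginal likelihoods.
# The log marginal likelihoods below are invented for illustration.
import numpy as np

log_marginals = np.array([-105.2, -103.8, -104.5])   # log P(D | M_k), hypothetical
prior_probs   = np.array([1/3, 1/3, 1/3])            # P(M_k), uniform here

log_post = log_marginals + np.log(prior_probs)
log_post -= log_post.max()            # subtract max to guard against underflow
post_probs = np.exp(log_post)
post_probs /= post_probs.sum()        # the denominator of (10.4)
print(post_probs)
```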

A quantity of interest $\Delta$ that is well defined under every model then has posterior mean and variance obtained by averaging over the models:

$E[\Delta \mid D] = \sum_{k=1}^{K} \hat{\Delta}_k\, P(M_k \mid D)$,    (10.6)

$\mathrm{var}[\Delta \mid D] = \sum_{k=1}^{K} \left( \mathrm{var}[\Delta \mid D, M_k] + \hat{\Delta}_k^2 \right) P(M_k \mid D) - E[\Delta \mid D]^2$,    (10.7)

where $\hat{\Delta}_k = E[\Delta \mid D, M_k]$ is the posterior mean under model $M_k$ (Raftery, 1993a).
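Equations (10.6) and (10.7) combine per-model summaries with the model probabilities from (10.4); (10.7) is the law of total variance, with the within-model variances and the spread of the per-model means both contributing. A sketch, with hypothetical per-model means, variances, and model probabilities:

```python
# Model-averaged posterior mean and variance of Delta, per (10.6)-(10.7).
# All numbers are invented for illustration.
import numpy as np

post_probs  = np.array([0.2, 0.5, 0.3])     # P(M_k | D), as in (10.4)
delta_means = np.array([1.0, 1.4, 0.8])     # Delta_hat_k = E[Delta | D, M_k]
delta_vars  = np.array([0.10, 0.05, 0.20])  # var[Delta | D, M_k]

mean_bma = np.sum(delta_means * post_probs)                                   # (10.6)
var_bma  = np.sum((delta_vars + delta_means**2) * post_probs) - mean_bma**2   # (10.7)
print(f"E[Delta | D]   = {mean_bma:.4f}")
print(f"var[Delta | D] = {var_bma:.4f}")
```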