ABSTRACT

This chapter defines the h-likelihood and provides insight into inference and prediction based on the h-likelihood. It introduces the extended likelihood principle underlying the h-likelihood framework and shows how it relates both to classical likelihood inference and to Bayesian inference. Birnbaum proved that the classical likelihood function contains all the information in the observed data about the fixed parameter. In the extended likelihood framework the unobserved random variables appear in the stochastic model as random instances, but in statistical inference they are treated as unknowns. For the linear mixed model both the marginal likelihood and the REML likelihood are straightforward to derive, but for most other distributions the integral defining the marginal likelihood has no analytical form and some approximation is required. Inference on random effects has important practical use in prediction. Lee and Nelder showed that their h-likelihood approach can be used for inference about general models including unobservable random variables, such as future outcomes, missing data, latent variables, factors in factor analysis, and potential outcomes.
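
As a brief orientation to the quantities summarized above, the following is a minimal sketch in one common notation (y for the observed data, v for the unobserved random variables, \theta for the fixed parameters); the chapter's own notation and definitions may differ.

  h(\theta, v) = \log f_\theta(y \mid v) + \log f_\theta(v)                                              (extended or h-likelihood)
  L(\theta) = \int \exp\{ h(\theta, v) \} \, dv                                                          (marginal likelihood)
  \log L(\theta) \approx \Big[ h(\theta, v) - \tfrac{1}{2} \log \det\!\big\{ -\partial^2 h(\theta, v) / \partial v \, \partial v^\top / (2\pi) \big\} \Big]_{v = \hat{v}(\theta)}   (first-order Laplace approximation)

where \hat{v}(\theta) maximizes h(\theta, v) in v for fixed \theta. For the linear mixed model the integral is Gaussian and available in closed form; for most other distributions an approximation of this kind is needed.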