ABSTRACT

One of the few points on which theoretical statisticians of all persuasions agree is the importance of the role played by the likelihood function in statistical inference. To construct a likelihood function, it is usually necessary to posit a probabilistic mechanism specifying, for a range of parameter values, the probabilities of all relevant samples that might possibly have been observed. This chapter shows how inferences can be drawn from experiments in which there is insufficient information to construct a likelihood function. It focuses mainly on the case in which the observations are independent and the effects of interest can be described by a model for E(Y). For quasi-likelihood functions, the quasi-likelihood analogue of the information matrix plays the same role as the Fisher information does for ordinary likelihood functions. The log likelihood for the full data, which is the sum of the logarithms of such factors, is numerically and algebraically unpleasant.
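As a hypothetical illustration of the quasi-likelihood idea summarized above, the sketch below specifies only a mean model for E(Y) and a variance function, rather than a full probabilistic mechanism, and solves the resulting quasi-score equations by Fisher scoring. The log link, the variance function V(mu) = mu, and all data values are assumptions chosen for the example, not taken from the chapter:

```python
import numpy as np

def quasi_score_fit(x, y, beta=0.0, tol=1e-10, max_iter=100):
    """Fisher scoring for a scalar coefficient in the mean model
    mu_i = exp(beta * x_i) with variance function V(mu) = mu.
    Only E(Y) and V(mu) are specified; no full likelihood is posited."""
    for _ in range(max_iter):
        mu = np.exp(beta * x)
        score = np.sum(x * (y - mu))   # quasi-score U(beta)
        info = np.sum(x**2 * mu)       # quasi-information, the analogue
                                       # of the Fisher information
        step = score / info
        beta += step
        if abs(step) < tol:
            break
    return beta

# Invented data for illustration only.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.0, 2.0, 3.0, 4.0, 8.0])
beta_hat = quasi_score_fit(x, y)
```

At convergence the quasi-score is zero at `beta_hat`, so the estimate behaves like a maximum-likelihood estimate even though no likelihood was ever written down; this is the sense in which the quasi-information matrix stands in for the Fisher information.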