ABSTRACT

This chapter reviews key elements of the basic theory of parametric statistical inference: likelihood and likelihood quantities, the closely associated ideas of sufficiency, conditioning and ancillarity, model adequacy, and parameter orthogonality. It discusses the main types of statistical models: transformation models, the nicest exponential models, which it terms prime exponential models, and curved sub-models of these. The contrast between the equations for these model classes illustrates the simplifications gained within the class of exponential models. The same ideas apply to more complex problems, for example to stochastic processes in continuous time, to mixtures of discrete and continuous responses, and to optimal stopping. A mathematically careful account of the log-likelihood for such a process, and more generally for any stochastic process in continuous time, demands closer attention to the limiting operations involved in passing from discrete to continuous time.