ABSTRACT

In this chapter, we describe classical inference for the general linear model that we introduced in Chapter 4. The least squares approach did not require any distributional assumptions on the model errors. In many cases, however, one wishes to construct interval estimates for the parameters of interest, as well as to test hypotheses about these parameters or functions of them. Parametric inference proceeds by assuming that the errors are generated by some probability model. The most widely used assumption is that of normality; the multivariate normal distribution was defined in Chapter 5. In Chapter 6, we looked at methods for assessing multivariate normality and studied some transformations to normality. The properties of the multivariate normal distribution ensure that the least squares solutions for the general linear model parameters are also multivariate normal, as Result 7.1.1 shows. Using results on distributions of quadratic forms from Section 5.4, we derive tests of hypotheses in Section 7.2, followed by a discussion of a nested sequence of hypotheses in Section 7.3. Section 7.4 describes the construction of marginal confidence intervals and joint confidence regions for suitable functions of the model parameters.
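
To fix ideas, a brief sketch of the distributional result in generic notation (assuming the design matrix $\mathbf{X}$ is $n \times p$ with full column rank; Result 7.1.1 may be stated under weaker conditions): if
\[
\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\varepsilon}, \qquad \boldsymbol{\varepsilon} \sim N_n(\mathbf{0}, \sigma^2 \mathbf{I}_n),
\]
then the least squares estimator, being a linear function of the multivariate normal vector $\mathbf{y}$, satisfies
\[
\widehat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y} \sim N_p\bigl(\boldsymbol{\beta},\, \sigma^2 (\mathbf{X}'\mathbf{X})^{-1}\bigr).
\]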