ABSTRACT

This chapter requires a brief preamble. The subject of linear statistical models is very broad: there are many good books devoted to it, and universities often offer separate courses on particular aspects of this subfield of Statistics. Most commonly, the subject is divided into two parts, regression analysis and the analysis of variance (ANOVA), each taught in a course of its own. In a single chapter of a book like the present one, the coverage of linear model theory is necessarily limited. I will cover only the simplest models in each area: simple linear (straight-line) regression and one-way (single-factor) analysis of variance. I will discuss some of the mathematical aspects of both topics, my aim being to illustrate how the concepts we have studied so far shed light on the behavior of, and the justification for, the standard procedures in these two areas. For example, I will show that, even without the usual assumption of normality, the standard estimators of the coefficients in a simple linear regression model are the best linear unbiased estimators of those parameters, and that under the assumption of normality they are also the maximum likelihood estimators. I will also show that the standard F test in one-way ANOVA may be derived as a likelihood ratio test.
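As a concrete point of reference for the discussion previewed above, the "standard estimators" of the coefficients in a simple linear regression model are the familiar least-squares estimates. The following is a minimal illustrative sketch, not taken from the chapter itself; the function name `ols_line` and the example data are invented for illustration.

```python
def ols_line(x, y):
    """Least-squares slope and intercept for the straight-line model
    y_i = b0 + b1*x_i + e_i, using the closed-form formulas
        b1_hat = sum((x_i - xbar)(y_i - ybar)) / sum((x_i - xbar)^2)
        b0_hat = ybar - b1_hat * xbar
    """
    n = len(x)
    xbar = sum(x) / n
    ybar = sum(y) / n
    sxy = sum((xi - xbar) * (yi - ybar) for xi, yi in zip(x, y))
    sxx = sum((xi - xbar) ** 2 for xi in x)
    b1 = sxy / sxx
    b0 = ybar - b1 * xbar
    return b0, b1

# Hypothetical data lying exactly on the line y = 2 + 3x; the
# least-squares fit recovers the intercept 2 and slope 3 exactly.
x = [0, 1, 2, 3, 4]
y = [2, 5, 8, 11, 14]
b0, b1 = ols_line(x, y)
print(b0, b1)  # 2.0 3.0
```

These are the estimators the chapter shows to be best linear unbiased without any normality assumption, and maximum likelihood under normality.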