ABSTRACT

Least squares parameter estimators have long been advocated because of their optimal statistical properties, and several characteristics of least squares estimation contribute to its popularity. Least squares estimators are comparatively easy to compute. They are unbiased estimators of the regression parameters and, although they are not the only unbiased estimators, they have the smallest variances among all unbiased linear functions of the responses under relatively mild assumptions on the true model. One reason for regarding fitted or theoretical regression models as approximations is that data on the response variable are often not collected over the entire range of interest of the predictor variables. Transformations of the response and predictor variables are often used not only to linearize a theoretical or empirical relationship, but also to make estimated regression coefficients more directly comparable. Before single-variable models are fitted, this chapter takes up the question of whether an intercept parameter should be included in the model specification.
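As a minimal sketch of the intercept question raised above (the data, seed, and noise level here are hypothetical, not taken from the chapter), the following compares ordinary least squares fits with and without a column of ones in the design matrix:

```python
import numpy as np

# Hypothetical data generated from a model with a true intercept of 2
# and a true slope of 3, plus small random noise.
rng = np.random.default_rng(0)
x = np.linspace(1.0, 10.0, 20)
y = 2.0 + 3.0 * x + rng.normal(scale=0.5, size=x.size)

# Model with an intercept: the design matrix includes a column of ones,
# so lstsq returns estimates of both the intercept and the slope.
X_with = np.column_stack([np.ones_like(x), x])
beta_with, *_ = np.linalg.lstsq(X_with, y, rcond=None)

# Model without an intercept: regression through the origin.  The single
# fitted slope must absorb the omitted intercept, biasing it upward here.
X_without = x.reshape(-1, 1)
beta_without, *_ = np.linalg.lstsq(X_without, y, rcond=None)

print("with intercept:   ", beta_with)
print("without intercept:", beta_without)
```

When the true intercept is nonzero, forcing the fit through the origin distorts the slope estimate, which is one practical reason the specification choice matters before fitting single-variable models.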