ABSTRACT

This chapter shows that biased linear estimators can locally improve on the least squares estimator. It presents three shrinkage estimators: the ridge estimator, the generalized ridge estimator, and the principal component estimator. The chapter also explains the Stein estimator, which is likewise a biased shrinkage estimator. A stepwise regression procedure usually begins with no variables in the model and subsequently adds variables until the model is satisfactory. The order in which variables are included is determined by their partial correlation coefficients, which measure the importance of the variables not yet in the model. The most widely used transformation is the Box–Cox transformation, a simple power transformation of the response variable. The Box–Cox transformation may lead to a linear model with normally distributed errors and homogeneous variance.
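The ridge estimator mentioned above can be sketched in a few lines of NumPy. This is a minimal illustration, not the chapter's own derivation: the design matrix X, response y, and shrinkage parameter k below are invented for the example. The ridge estimator replaces the least squares solution (X'X)^{-1}X'y with (X'X + kI)^{-1}X'y, which shrinks the coefficient vector toward zero.

```python
import numpy as np

# Illustrative data (hypothetical; not from the chapter).
rng = np.random.default_rng(0)
n, p = 50, 3
X = rng.normal(size=(n, p))
beta_true = np.array([1.0, 2.0, -1.0])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Ordinary least squares: beta_ols = (X'X)^{-1} X'y
beta_ols = np.linalg.solve(X.T @ X, X.T @ y)

# Ridge estimator: beta_ridge = (X'X + kI)^{-1} X'y, with shrinkage parameter k > 0
k = 1.0
beta_ridge = np.linalg.solve(X.T @ X + k * np.eye(p), X.T @ y)

# Shrinkage: the ridge coefficient vector has a smaller norm than OLS
print(np.linalg.norm(beta_ridge) < np.linalg.norm(beta_ols))
```

For any k > 0 the ridge coefficients have a strictly smaller norm than the least squares coefficients, which is the sense in which the estimator "shrinks"; the bias this introduces can be outweighed by the reduction in variance.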