ABSTRACT

This chapter describes the use of matrices to write regression models. The economy of notation achieved with matrices makes it possible to arrive at some interesting new insights and to derive several of the important properties of regression analysis. The general linear model is a generalization of the matrix form of the simple linear regression model. The regression estimates given by standard computer programs are least squares estimates. In multiple regression, simple algebraic expressions for the parameter estimates are not possible. The relationship between estimated principal component regression coefficients and the original least squares regression coefficient estimates is somewhat simpler when the covariance matrix is used. Principal components (PC) regression is a method designed to identify near redundancies among the predictor variables. It should be noted that PC regression is just as sensitive to violations of the assumptions as ordinary multiple regression.
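For reference, a minimal sketch of the matrix notation the abstract alludes to, using conventional symbols rather than the chapter's own numbering: the general linear model and its least squares solution can be written as

\[
\mathbf{y} = \mathbf{X}\boldsymbol{\beta} + \boldsymbol{\epsilon},
\qquad
\hat{\boldsymbol{\beta}} = (\mathbf{X}'\mathbf{X})^{-1}\mathbf{X}'\mathbf{y},
\]

where $\mathbf{y}$ is the $n \times 1$ vector of responses, $\mathbf{X}$ the $n \times p$ matrix of predictor values, $\boldsymbol{\beta}$ the $p \times 1$ vector of parameters, and $\boldsymbol{\epsilon}$ the vector of random errors. This compact form is what makes general results possible even though, unlike the simple regression case, no simple scalar expressions exist for the individual parameter estimates.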