ABSTRACT

This chapter recasts the linear regression problem as an equivalent system of simultaneous linear equations that elegant matrix operations solve efficiently in MATLAB or other electronic equation solvers. The basic matrices are the n-row observation vector, holding the dependent variable magnitudes of the n observations, and the n-row by 2-column design matrix, with ones in the first column and the independent variable values of the n observations in the second column. Multiplying the inverse of the product of the design matrix's transpose and the design matrix by the product of the design matrix's transpose and the observation vector yields the solution: a 2-row column vector holding the least squares intercept and slope estimates. Comparably elegant matrix formulations are also developed for the measures of fit, including (1) the solution variances, (2) the solution and prediction confidence intervals, (3) the correlation coefficient, and (4) the ANOVA table. The rest of this book shows how these matrix formulations for the least squares linear regression problem, with n observations and m = 2 unknowns (n ≥ m), also hold for any least squares problem with m > 2 unknowns.
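As a minimal MATLAB sketch of the computation described above, assuming hypothetical n-by-1 column vectors x and y holding the independent and dependent variable observations:

    x = [1; 2; 3; 4; 5];           % independent variable observations (example data)
    y = [2.1; 3.9; 6.2; 7.8; 9.9]; % dependent variable observations (example data)
    n = numel(y);
    X = [ones(n,1) x];             % n-by-2 design matrix: column of ones, then x
    b = (X' * X) \ (X' * y);       % 2-by-1 solution: b(1) = intercept, b(2) = slope

MATLAB's backslash operator solves the normal equations directly and is numerically preferable to forming inv(X'*X) explicitly, though both implement the formula the abstract describes.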