ABSTRACT

In this chapter, we introduce matrix algebra in order to (i) define the multiple regression model and its assumptions, (ii) define the ordinary least squares (OLS) estimators, and (iii) prove that they are unbiased in the random-X case under the Gauss-Markov model, noting that this includes the fixed-X model as a special case. Standard errors of the OLS estimates are also defined in matrix terms, and the results from this chapter are applied to an example in which students' grade point averages are predicted.
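
As a preview of the matrix formulation developed in the chapter, the sketch below computes the OLS estimator and its standard errors directly from the standard matrix formulas, using numpy and synthetic data. The data here are purely illustrative stand-ins (they are not the chapter's grade point average dataset), and the variable names are our own; under the usual full-rank and Gauss-Markov assumptions, the estimator is beta_hat = (X'X)^{-1} X'y with se(beta_hat_j) = sqrt(sigma2_hat * [(X'X)^{-1}]_{jj}).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: n observations, k predictors, plus an intercept.
n, k = 100, 2
X = np.column_stack([np.ones(n), rng.normal(size=(n, k))])  # design matrix with intercept column
beta_true = np.array([2.5, 0.4, -0.3])                      # hypothetical true coefficients
y = X @ beta_true + rng.normal(scale=0.5, size=n)           # response, e.g. a GPA-like outcome

# OLS estimator in matrix form: beta_hat = (X'X)^{-1} X'y
XtX_inv = np.linalg.inv(X.T @ X)
beta_hat = XtX_inv @ X.T @ y

# Residual variance estimate: sigma2_hat = RSS / (n - p),
# standard errors from the diagonal of sigma2_hat * (X'X)^{-1}.
p = X.shape[1]
resid = y - X @ beta_hat
sigma2_hat = resid @ resid / (n - p)
se = np.sqrt(sigma2_hat * np.diag(XtX_inv))

print("beta_hat:", beta_hat)
print("standard errors:", se)
```

In practice one would solve the normal equations with `np.linalg.solve` or a QR decomposition rather than forming the explicit inverse, but the inverse is kept here because its diagonal appears directly in the standard-error formula the abstract refers to.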