ABSTRACT

There are three principal uses of multiple regression: testing hypotheses, determining the coefficients of independent variables, and forecasting. Multiple regression has the potential for greater accuracy than simple regression because more explanatory independent variables may be introduced into the regression model. Of course, introducing more independent variables increases the complexity of the regression model and the cost of collecting and storing the required past data. A regression software package initially provides a correlation matrix for the regression. This matrix remains constant and need be computed only once, before any regressions are performed. The symmetric correlation matrix indicates the statistical linear correlation among the independent variables and between each independent variable and the dependent variable. A basic assumption of multiple regression analysis, which determines the best-fit line for a set of data, is the linearity of the data analyzed. It has been demonstrated that many non-linear functions can be transformed into linear functions.
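As a minimal sketch of the two computational points summarized above, and not an implementation prescribed by the paper, the following Python fragment computes the symmetric correlation matrix once from hypothetical past data and illustrates one common linearizing transformation, fitting the power model y = a*x^b by regressing ln(y) on ln(x). The variable names and data are assumptions introduced only for illustration.

    import numpy as np

    # Hypothetical past data: two independent variables (x1, x2)
    # and one dependent variable (y).
    rng = np.random.default_rng(0)
    x1 = rng.uniform(1.0, 10.0, size=50)
    x2 = rng.uniform(1.0, 10.0, size=50)
    y = 2.0 * x1 ** 1.5 + 0.5 * x2 + rng.normal(0.0, 0.5, size=50)

    # Correlation matrix: computed once, before any regression is run.
    # Rows of `data` are variables, so the result is a symmetric 3 x 3
    # matrix of linear correlations among the independent variables
    # and with the dependent variable.
    data = np.vstack([x1, x2, y])
    corr = np.corrcoef(data)
    print(np.round(corr, 3))

    # Linearizing transformation: the power model y = a * x^b becomes
    # ln(y) = ln(a) + b * ln(x), a straight line in the transformed
    # variables, so an ordinary least-squares fit can be applied.
    ln_x1, ln_y = np.log(x1), np.log(y)
    b, ln_a = np.polyfit(ln_x1, ln_y, 1)   # slope b, intercept ln(a)
    print("b =", round(b, 3), " a =", round(float(np.exp(ln_a)), 3))

In this sketch the correlation matrix depends only on the stored data, so it is computed a single time and reused across any regressions that follow, while the logarithmic transform shows how a non-linear relationship can be brought within the linearity assumption of the regression model.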