ABSTRACT

This chapter turns from the predictor-outcome procedures of multiple regression to the analysis of a single set of variables. When there are substantial correlations among the members of a set of variables, that set carries, in a sense, less information than the number of variables would suggest. Near multicollinearity is an example of this phenomenon: although p vectors may be measured, they lie very close to a space of fewer than p dimensions. Both principal-component analysis and factor analysis take a collection of variables, examine its correlational structure, and extract its principal dimensions of variation. By reducing a large set of variables to a smaller one, they can locate patterns in the data and considerably simplify the subsequent analysis of the variables. The equality of the coefficients and the loadings is a consequence of the orthogonality built into the factor model.
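The extraction of principal dimensions described above can be illustrated with a minimal sketch of principal-component analysis, computed as the eigendecomposition of a correlation matrix. The simulated data and variable names here are illustrative assumptions, not an example from the chapter: four nearly multicollinear variables are generated from two underlying dimensions, so the first two components should carry almost all of the variance.

```python
import numpy as np

# Illustrative sketch: PCA via eigendecomposition of the correlation matrix.
# The simulated data below are an assumption for demonstration only.
rng = np.random.default_rng(0)

# Simulate n observations of p = 4 variables lying close to a
# 2-dimensional subspace (near multicollinearity).
n, p = 500, 4
latent = rng.standard_normal((n, 2))           # two underlying dimensions
mixing = rng.standard_normal((2, p))           # map them into p variables
X = latent @ mixing + 0.1 * rng.standard_normal((n, p))  # small noise

# Correlation matrix of the observed variables.
R = np.corrcoef(X, rowvar=False)

# Principal components are the eigenvectors of R; each eigenvalue is the
# variance carried by the corresponding component.
eigvals, eigvecs = np.linalg.eigh(R)
order = np.argsort(eigvals)[::-1]              # largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# For a correlation matrix the total variance equals p, so the
# proportion of variance explained by each component is eigval / p.
explained = eigvals / eigvals.sum()
print(explained)
```

Because the four variables are nearly redundant, the first two entries of `explained` dominate, showing how the set carries less information than its four variables would suggest.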