ABSTRACT

Eigenvalues and eigenvectors of a matrix provide a fundamental characterisation of the matrix and are central to many of the theoretical results on matrices. They are closely connected to determinants, and they provide a representation that permits the definition of fractional powers of a matrix, e.g., square and cube roots. In most of this chapter the matrices considered are real symmetric matrices. In statistics, the eigenanalysis of the variance matrix is the basis of many statistical analyses. This matrix is necessarily symmetric and positive semi-definite. Principal component analysis, a fundamental tool of multivariate analysis, involves projecting (more exactly, translating and rotating) the data onto the eigenvectors of the variance matrix. The advantages of doing this are explained in the chapter. Other techniques of multivariate analysis, such as linear discriminant analysis, canonical correlation analysis and partial least squares, all rest on the eigenanalysis of various matrices derived from the data.
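
The following is a minimal NumPy sketch of the two ideas summarised above: forming a fractional power of a symmetric positive semi-definite matrix from its spectral decomposition, and projecting centred data onto the eigenvectors of the variance matrix as in principal component analysis. The data matrix and random seed are invented for illustration and are not from the chapter.

```python
import numpy as np

rng = np.random.default_rng(0)                       # hypothetical sample data
X = rng.normal(size=(200, 3)) @ np.diag([3.0, 1.0, 0.5])

# Variance (covariance) matrix: symmetric and positive semi-definite.
S = np.cov(X, rowvar=False)

# Eigenanalysis of a symmetric matrix: eigh returns real eigenvalues
# (in ascending order) and orthonormal eigenvectors.
eigvals, eigvecs = np.linalg.eigh(S)

# Fractional power via the spectral decomposition:
# S^(1/2) = V diag(sqrt(lambda)) V^T, so that S^(1/2) S^(1/2) = S.
S_half = eigvecs @ np.diag(np.sqrt(eigvals)) @ eigvecs.T
assert np.allclose(S_half @ S_half, S)

# Principal component analysis: centre the data, then rotate it onto the
# eigenvectors. The resulting columns (principal component scores) are
# uncorrelated, with variances equal to the eigenvalues.
scores = (X - X.mean(axis=0)) @ eigvecs
assert np.allclose(np.cov(scores, rowvar=False), np.diag(eigvals))
```

The same decomposition yields any power S^p by replacing the square root with eigvals ** p, which is the representation the abstract alludes to for fractional powers.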