Chapter 11

Eigenvalues and Eigenvectors

When we multiply a vector x by a matrix A, we transform x into a new vector Ax. Usually, this changes the direction of the vector. Certain exceptional vectors x, which have the same direction as Ax, play a central role in linear algebra. For such an exceptional vector, Ax is a scalar λ times the original x. For this to make sense, the matrix A must clearly be square; otherwise Ax has a different number of elements than x, and the two vectors do not reside in the same space. (In this chapter we will assume all matrices are square.) This yields the following equation for a square matrix A:

Ax = λx or (A − λI)x = 0 .  (11.1)

Obviously x = 0 satisfies the above equation for every λ; that is the trivial solution. The more interesting case is x ≠ 0. A nonzero vector x satisfying (11.1) is called an eigenvector of A, and the scalar λ is called an eigenvalue of A. If the eigenvalue is zero, then any nonzero vector in the null space of A is an eigenvector; this shows that more than one eigenvector can be associated with a single eigenvalue. In general, any nonzero x satisfying (11.1) for a particular eigenvalue λ is referred to as an eigenvector associated with the eigenvalue λ.
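
As a quick numerical illustration (not part of the text, and using NumPy purely as an assumed tool), the short sketch below computes the eigenvalues and eigenvectors of a small symmetric matrix and checks that the defining relation (11.1), Ax = λx, holds for each pair. The particular matrix A is a made-up example, not one taken from this chapter.

import numpy as np

# A small symmetric matrix chosen only for illustration.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# np.linalg.eig returns the eigenvalues and a matrix whose
# columns are the corresponding eigenvectors.
eigenvalues, eigenvectors = np.linalg.eig(A)

for i, lam in enumerate(eigenvalues):
    x = eigenvectors[:, i]
    # Verify (11.1): Ax should equal λx (up to floating-point error).
    assert np.allclose(A @ x, lam * x)
    print(f"eigenvalue {lam:.4f} with eigenvector {x}")

For this particular A, the eigenvalues come out as 3 and 1; any nonzero scalar multiple of the printed eigenvectors would satisfy (11.1) just as well, since eigenvectors are only determined up to scaling.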