ABSTRACT

The conjugate gradient method was introduced in Chapter 3. In this chapter we show that each iterate of the conjugate gradient method can be expressed as the initial guess plus a linear combination of the vectors A^i r(x0), where r0 = r(x0) = d − Ax0 is the initial residual; the vectors A^i r(x0) are called the Krylov vectors. Here x^m = x0 + c0 r0 + c1 A r0 + · · · + c_{m−1} A^{m−1} r0, and the coefficients c_i are the unknowns. They can be determined either by requiring J(x^m) = (1/2)(x^m)^T A x^m − (x^m)^T d, where A is an SPD matrix, to be a minimum, or by requiring R(x^m) = r(x^m)^T r(x^m) to be a minimum. The first case gives rise to the conjugate gradient method (CG), and the second case generates the generalized minimum residual method (GMRES). Both methods have many variations, and they can be accelerated by using suitable preconditioners, M, where one applies these methods to M^{-1}Ax = M^{-1}d. A number of preconditioners will be described, and the parallel aspects of these methods will be illustrated by MPI codes for preconditioned CG and a restarted version of GMRES.
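The preconditioned CG iteration summarized above can be sketched in serial form as follows. This is only a minimal illustration, not the chapter's MPI code; the function and variable names (`pcg`, `M_inv`, the Jacobi preconditioner in the example) are choices made here for the sketch, and the test matrix is an assumed SPD tridiagonal system.

```python
# Minimal sketch of preconditioned conjugate gradient (PCG) for an SPD
# matrix A, i.e. CG applied to M^{-1} A x = M^{-1} d. Not the chapter's
# MPI implementation; names and the example system are illustrative.
import numpy as np

def pcg(A, d, M_inv, x0, tol=1e-10, max_iter=200):
    """Solve Ax = d for SPD A; M_inv(r) applies the preconditioner M^{-1}."""
    x = x0.copy()
    r = d - A @ x                  # initial residual r0 = d - A x0
    z = M_inv(r)                   # preconditioned residual
    p = z.copy()                   # initial search direction
    rz = r @ z
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rz / (p @ Ap)      # step length minimizing J along p
        x += alpha * p
        r -= alpha * Ap
        if np.linalg.norm(r) < tol:
            break
        z = M_inv(r)
        rz_new = r @ z
        p = z + (rz_new / rz) * p  # new A-conjugate search direction
        rz = rz_new
    return x

# Example (assumed for illustration): SPD tridiagonal system with a
# Jacobi (diagonal) preconditioner M = diag(A).
n = 50
A = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
d = np.ones(n)
diag = np.diag(A)
x = pcg(A, d, lambda r: r / diag, np.zeros(n))
print(np.linalg.norm(A @ x - d) < 1e-8)
```

Each iterate produced this way lies in x0 plus the span of the Krylov vectors, and the alpha and beta updates are exactly what makes J(x^m) a minimum over that subspace.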