ABSTRACT

This chapter presents the analysis of conjugate gradient type methods with parameter n ≥ 1, including, e.g., MR and CGNE. It will be shown that the iterates converge monotonically to T†y if the right-hand side data y belong to ℛ(T). If y ∉ ℛ(T), then the iterates typically diverge to infinity in norm. Nevertheless, if the right-hand side yδ is an approximation of y ∈ ℛ(T), then some iterates approximate the exact solution T†y with order-optimal accuracy. The crux of the matter is to decide when this is the case. If the magnitude of the perturbation ║y − yδ║ is known, such a decision can be based on the discrepancy principle. Otherwise, heuristic arguments are required to halt the iteration. One heuristic stopping rule is presented at the end of this chapter; it has the nice feature that it provides an a posteriori error estimate for the corresponding approximation.
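
To make the discrepancy-principle stopping concrete, the following minimal sketch applies plain conjugate gradients to the normal equations (one common reading of CGNE) for a matrix model of T and stops as soon as the residual norm drops below τ·δ, where δ bounds ║y − yδ║. The function name cgne_discrepancy and the parameters tau and max_iter are illustrative assumptions, not notation from the chapter.

```python
import numpy as np

def cgne_discrepancy(T, y_delta, delta, tau=1.1, max_iter=200):
    """CG applied to the normal equations T.T @ T x = T.T @ y_delta,
    stopped by the discrepancy principle ||y_delta - T x_k|| <= tau * delta.

    T        : (m, n) matrix modelling the forward operator
    y_delta  : perturbed right-hand side with ||y - y_delta|| <= delta
    tau      : discrepancy-principle parameter, tau > 1 (assumed value)
    """
    x = np.zeros(T.shape[1])
    r = y_delta.copy()            # residual y_delta - T x
    s = T.T @ r                   # gradient of the least-squares functional
    p = s.copy()                  # search direction
    s_norm2 = s @ s
    for k in range(max_iter):
        if np.linalg.norm(r) <= tau * delta:
            break                 # discrepancy principle satisfied: stop here
        q = T @ p
        alpha = s_norm2 / (q @ q)
        x += alpha * p
        r -= alpha * q
        s = T.T @ r
        s_norm2_new = s @ s
        beta = s_norm2_new / s_norm2
        p = s + beta * p
        s_norm2 = s_norm2_new
    return x, k
```

With exact data (δ = 0) the iterates of this sketch would run until max_iter, mirroring the monotone convergence case; with perturbed data the early stop is what prevents the divergence described above.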