ABSTRACT

This chapter describes the backfitting algorithm and justifies it as a method for estimating the additive model. It discusses the existence and uniqueness of solutions to the additive-model estimating equations and the convergence of the backfitting algorithm. It also presents some theory for standard-error bands and for the degrees of freedom of the estimated smooths, and relates backfitting to the Gram-Schmidt and Gauss-Seidel techniques. Stone studied rates of convergence for additive models in which the component functions are estimated by polynomials or regression splines. A Hilbert-space version of the additive model and backfitting can be formulated, with conditional-expectation operators playing the role of smoothers. Many smoothers have a projection part and a shrinking part; the idea is to combine the projection operations for all of the predictors into one large projection, and to use only the nonprojection part of each smoother in an iterative, backfitting-type operation.
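As a concrete illustration of the algorithm the chapter studies, the backfitting loop can be sketched as a Gauss-Seidel-style cycle over the predictors: each step smooths the current partial residuals against one predictor and re-centers the fitted component. The running-mean smoother, the two-predictor simulated data, and all function names below are illustrative assumptions, not the chapter's own examples.

```python
import random

def running_mean_smoother(x, r, k=10):
    # Smooth residuals r against predictor x with a simple
    # k-nearest-neighbor running mean (an illustrative stand-in
    # for the scatterplot smoothers discussed in the chapter).
    n = len(x)
    order = sorted(range(n), key=lambda i: x[i])
    fhat = [0.0] * n
    for rank, i in enumerate(order):
        lo, hi = max(0, rank - k), min(n, rank + k + 1)
        window = [r[order[j]] for j in range(lo, hi)]
        fhat[i] = sum(window) / len(window)
    return fhat

def backfit(xs, y, smoother, n_iter=20):
    # xs: list of p predictor vectors; y: response vector.
    # Cycle over predictors, smoothing partial residuals --
    # a Gauss-Seidel iteration on the estimating equations.
    n, p = len(y), len(xs)
    alpha = sum(y) / n                      # intercept = mean of y
    f = [[0.0] * n for _ in range(p)]       # additive components
    for _ in range(n_iter):
        for j in range(p):
            # partial residual: remove intercept and all other components
            r = [y[i] - alpha - sum(f[k][i] for k in range(p) if k != j)
                 for i in range(n)]
            f[j] = smoother(xs[j], r)
            m = sum(f[j]) / n               # center for identifiability
            f[j] = [v - m for v in f[j]]
    return alpha, f

# Usage on simulated additive data: y = x1^2 + 2*x2 + noise.
random.seed(0)
n = 200
x1 = [random.uniform(-1, 1) for _ in range(n)]
x2 = [random.uniform(-1, 1) for _ in range(n)]
y = [x1[i] ** 2 + 2 * x2[i] + random.gauss(0, 0.1) for i in range(n)]
alpha, (f1, f2) = backfit([x1, x2], y, running_mean_smoother)
```

The inner loop is exactly the Gauss-Seidel structure the chapter analyzes: each component estimate is updated in turn using the most recent values of the others, and the cycle is repeated until the fits stabilize.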