ABSTRACT

In general, adaptive algorithms are iterative search algorithms derived by minimizing a cost function in which the true statistics are replaced by their estimates. To study adaptive algorithms, it is therefore necessary to have a thorough understanding of iterative algorithms and their convergence properties. This chapter discusses the steepest descent method and Newton's method. The correlation matrix of the input gives rise to the eigenvalue spread problem: if the ratio of the maximum to the minimum eigenvalue (the eigenvalue spread) is large, the inverse of the correlation matrix may have large elements, which leads to difficulties in solving equations that involve inverse correlation matrices. To apply the method of steepest descent, we must first estimate the autocorrelation matrix and the cross-correlation vector from the data, since an ensemble of data from which to compute these statistics exactly is not available.
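The idea summarized above can be sketched in a few lines. The following is a minimal illustration, not the chapter's own derivation: it assumes a simple linear-filtering setup in which the sample estimates `R_hat` (autocorrelation matrix) and `p_hat` (cross-correlation vector) replace the true statistics in the steepest descent recursion; all data, dimensions, and variable names here are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: desired signal d generated from input x through unknown taps.
N, M = 5000, 4
w_true = np.array([0.8, -0.5, 0.3, 0.1])
x = rng.standard_normal(N + M - 1)
X = np.column_stack([x[i:i + N] for i in range(M)])  # tap-input matrix

d = X @ w_true + 0.01 * rng.standard_normal(N)

# Sample estimates standing in for the true (ensemble) statistics.
R_hat = X.T @ X / N   # autocorrelation matrix estimate
p_hat = X.T @ d / N   # cross-correlation vector estimate

# Steepest descent on the estimated quadratic cost; the step size is kept
# below 2 / lambda_max so the recursion converges for every eigenmode.
mu = 0.5 / np.max(np.linalg.eigvalsh(R_hat))
w = np.zeros(M)
for _ in range(500):
    w = w + mu * (p_hat - R_hat @ w)  # gradient of the cost is R w - p
```

Note that the admissible step size, and hence the convergence speed, is governed by the eigenvalues of `R_hat`: a large eigenvalue spread forces a small step size relative to the slowest mode, which is exactly the difficulty the abstract alludes to.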