ABSTRACT

This chapter covers the most popular modified least mean square (LMS)-type algorithms proposed over the years, as well as some recent ones proposed by Poularikas and Ramadan. Most of these algorithms were designed on an ad hoc basis to improve convergence behavior, reduce computational requirements, and decrease the steady-state mean-square error (MSE). The chapter introduces the sign algorithms. The variable step-size LMS algorithm was introduced to reconcile conflicting requirements: a large step-size parameter is needed for fast convergence, whereas a small step-size parameter is needed to reduce the misadjustment factor. It turns out that when one of the eigenvalues of the correlation matrix is zero, the solution may become unstable. Therefore, it is important to stabilize the LMS algorithm. In all the analyses of the Wiener filtering problem (the steepest-descent method, Newton's method, and the LMS algorithm), no constraint was imposed on the solution that minimizes the MSE.
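To make the two variants mentioned above concrete, here is a minimal sketch of a leaky LMS update with an optional sign-error mode, applied to a system-identification task. The function name `leaky_sign_lms`, the step size `mu`, and the leakage constant `leak` are illustrative choices, not the chapter's notation; the leakage term `(1 - mu*leak)` is the standard device for stabilizing the LMS recursion when the input correlation matrix has a zero eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(0)

def leaky_sign_lms(x, d, num_taps=4, mu=0.05, leak=1e-4, use_sign=False):
    """Leaky LMS adaptive filter; optionally the sign-error variant.

    Weight update (leaky LMS):
        w[n+1] = (1 - mu*leak) * w[n] + mu * e[n] * x_vec[n]
    The sign-error variant replaces e[n] with sign(e[n]),
    trading convergence speed for cheaper arithmetic.
    """
    w = np.zeros(num_taps)
    e = np.zeros(len(x))
    for n in range(num_taps, len(x)):
        # tap-input vector, most recent sample first
        x_vec = x[n - num_taps + 1 : n + 1][::-1]
        y = w @ x_vec          # filter output
        e[n] = d[n] - y        # estimation error
        err = np.sign(e[n]) if use_sign else e[n]
        # leakage shrinks the weights slightly each step,
        # keeping the recursion stable even for a singular
        # input correlation matrix
        w = (1.0 - mu * leak) * w + mu * err * x_vec
    return w, e

# demo: identify an unknown FIR plant from white-noise input
h = np.array([0.5, -0.3, 0.2, 0.1])     # "unknown" system
x = rng.standard_normal(5000)
d = np.convolve(x, h)[: len(x)]          # desired (plant) output
w, e = leaky_sign_lms(x, d, num_taps=4)
```

In this noiseless, white-input setting the weights converge close to the plant coefficients `h`; the small leakage introduces a slight bias, which is the usual price paid for the added stability.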