ABSTRACT

The least squares (LS) minimization problem constitutes the core of many real-time signal processing applications, such as adaptive filtering, system identification and beamforming [1]. There are two common variations of the LS problem in adaptive signal processing:

Solve the minimization problem
$$ w(n) = \arg\min_{w(n)} \left\| B(n) \bigl( X(n)\, w(n) - y(n) \bigr) \right\|_2 , \qquad (1) $$

where X(n) is a matrix of size n × p, w(n) is a vector of length p, y(n) is a vector of length n, and B(n) = diag{β^{n−1}, β^{n−2}, …, 1}, where β is the forgetting factor with 0 < β < 1.
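As a minimal illustration of problem (1), the sketch below solves the exponentially weighted LS problem directly with numpy; the function name `weighted_ls` and the synthetic data are purely illustrative and are not part of the referenced work, which addresses recursive (sample-by-sample) solutions.

```python
import numpy as np

def weighted_ls(X, y, beta):
    """Illustrative direct solution of min_w || B(n) (X(n) w - y(n)) ||_2,
    with B(n) = diag(beta^(n-1), ..., beta, 1) as defined above."""
    n = X.shape[0]
    # Exponential weights: oldest row gets beta^(n-1), newest row gets 1.
    b = beta ** np.arange(n - 1, -1, -1)
    # Ordinary LS on the pre-weighted data; lstsq also handles rank deficiency.
    w, *_ = np.linalg.lstsq(b[:, None] * X, b * y, rcond=None)
    return w

# Toy usage with synthetic data (hypothetical example).
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true + 0.01 * rng.standard_normal(50)
print(weighted_ls(X, y, beta=0.98))
```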

Solve the minimization problem in (1) subject to the linear constraints
$$ c_i^T w(n) = r_i , \qquad i = 1, 2, \ldots, N, \qquad (2) $$

where c_i is a vector of length p and r_i is a scalar. Here we consider only the special case of the MVDR (minimum variance distortionless response) beamforming problem [2], for which y(n) = 0 for all n and (1) is solved subject to each linear constraint individually; i.e., there are N linearly constrained LS problems.
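For this MVDR special case, each single-constraint problem, minimize ||B(n) X(n) w||_2 subject to c_i^T w = r_i, has the standard closed-form solution obtained via a Lagrange multiplier, w = r_i Φ^{-1} c_i / (c_i^T Φ^{-1} c_i), where Φ = X^T B^T B X is the exponentially weighted correlation matrix. The sketch below, again assuming numpy and an illustrative function name `mvdr_weights`, computes one such solution directly rather than recursively.

```python
import numpy as np

def mvdr_weights(X, beta, c, r):
    """Illustrative direct solution of min_w || B X w ||_2  s.t.  c^T w = r.

    Closed form: w = r * Phi^{-1} c / (c^T Phi^{-1} c), Phi = X^T B^T B X.
    """
    n = X.shape[0]
    b = beta ** np.arange(n - 1, -1, -1)   # diagonal of B(n)
    BX = b[:, None] * X                    # B(n) X(n) without forming B explicitly
    Phi = BX.T @ BX                        # exponentially weighted correlation matrix
    u = np.linalg.solve(Phi, c)            # Phi^{-1} c (assumes Phi is nonsingular)
    return r * u / (c @ u)

# Toy usage: one constraint (in practice, one constrained problem per i = 1..N).
rng = np.random.default_rng(1)
X = rng.standard_normal((100, 4))
c = np.array([1.0, 0.0, 0.0, 0.0])
w = mvdr_weights(X, beta=0.99, c=c, r=1.0)
print(w, c @ w)   # c^T w should equal r = 1
```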