ABSTRACT

In Chapter 4, we transform several dimensionality reduction algorithms that take the form of a specific class of generalized eigenvalue problems into an equivalent least squares formulation, which can be solved efficiently by existing algorithms such as the iterative conjugate gradient algorithm [94, 178]. However, this formulation suffers from two drawbacks. First, the equivalence relies on the key assumption that all data points are linearly independent. This assumption tends to hold for high-dimensional data, but it is likely to fail for (relatively) low-dimensional data. Second, the equivalence between the least squares formulation and the original formulation no longer holds when regularization is applied to the generalized eigenvalue problem.
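
As a brief illustration of the computational strategy mentioned above (a minimal sketch, not the formulation developed in Chapter 4), the snippet below solves a generic least squares problem min_w ||Xw - t||^2 with the conjugate gradient method applied to the normal equations through a matrix-free operator, so the d x d matrix X^T X is never formed explicitly; the data matrix X, target vector t, and problem sizes are hypothetical placeholders.

    # A minimal sketch: conjugate gradient on the normal equations
    # X^T X w = X^T t of a generic least squares problem (illustrative only).
    import numpy as np
    from scipy.sparse.linalg import LinearOperator, cg

    rng = np.random.default_rng(0)
    n, d = 500, 2000                      # hypothetical sizes: n samples, d features (n < d)
    X = rng.standard_normal((n, d))       # placeholder data matrix
    t = rng.standard_normal(n)            # placeholder target vector (e.g., encoding class labels)

    # Matrix-vector product w -> X^T (X w); avoids forming the d x d normal matrix.
    def normal_matvec(w):
        return X.T @ (X @ w)

    A = LinearOperator((d, d), matvec=normal_matvec, dtype=X.dtype)
    b = X.T @ t

    w, info = cg(A, b, maxiter=200)       # iterative solve; info == 0 indicates convergence
    print("converged" if info == 0 else f"stopped early, info={info}")

Because only matrix-vector products with X and X^T are required, each iteration costs O(nd), which is what makes iterative solvers attractive for the high-dimensional setting discussed in this chapter.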