ABSTRACT

Principal Component Analysis (PCA) is a fundamental, classical approach to dimensionality reduction. Its major limitation, however, is its inability to handle nonlinear data: PCA can only recover structure when the high-dimensional data lie on or near a linear subspace. To handle nonlinear data, we use a kernel-based method that is a nonlinear generalization of PCA. This method rests on the “kernel trick,” in which inner products are replaced by a kernel function. The kernel function can be viewed as a nonlinear similarity measure, and many linear approaches can be generalized to nonlinear methods by exploiting this trick. The resulting variant of PCA uses kernels to compute the principal components and thereby overcomes PCA’s restriction to linear structure. This technique, called Kernel PCA, is discussed in this chapter, along with its applications and some illustrative examples.
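To make the kernel-trick idea above concrete, the following is a minimal sketch (not the chapter's reference implementation) of Kernel PCA with an RBF kernel, using only NumPy: compute the kernel matrix, center it in feature space, and take leading eigenvectors as the nonlinear principal components. The function names, the choice of RBF kernel, and the `gamma` parameter are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    K = rbf_kernel(X, gamma)
    # Center the kernel matrix, i.e. center the data in feature space.
    one_n = np.ones((n, n)) / n
    Kc = K - one_n @ K - K @ one_n + one_n @ K @ one_n
    # Eigendecomposition of the centered kernel; keep the leading components.
    eigvals, eigvecs = np.linalg.eigh(Kc)
    idx = np.argsort(eigvals)[::-1][:n_components]
    # Normalize eigenvectors so the feature-space components have unit norm.
    alphas = eigvecs[:, idx] / np.sqrt(np.maximum(eigvals[idx], 1e-12))
    # Projections of the training points onto the kernel principal components.
    return Kc @ alphas

# Toy nonlinear data: two concentric rings, which linear PCA cannot separate.
rng = np.random.default_rng(0)
t = rng.uniform(0.0, 2.0 * np.pi, 200)
r = np.concatenate([np.full(100, 1.0), np.full(100, 3.0)])
X = np.column_stack([r * np.cos(t), r * np.sin(t)])
X += 0.05 * rng.normal(size=X.shape)

Z = kernel_pca(X, n_components=2, gamma=2.0)
print(Z.shape)  # one 2-D embedding per input point
```

With a suitable `gamma`, the two rings become linearly separable in the embedding, which is the behavior the chapter develops in detail.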