ABSTRACT

This chapter provides an introduction to support vector machines, kernel Fisher discriminant analysis, and kernel principal component analysis as examples of successful kernel-based learning methods. It presents a brief background on Vapnik–Chervonenkis theory and kernel feature spaces, and then proceeds to kernel-based learning in supervised and unsupervised scenarios, including practical and algorithmic considerations. The chapter also covers the kernel-Adatron, which is derived from the Adatron algorithm originally proposed by Anlauf and Biehl in a statistical mechanics setting; it constructs a large-margin hyperplane using online learning and is very simple to implement. Finally, the chapter illustrates the usefulness of kernel algorithms by discussing selected applications of supervised and unsupervised learning with kernels, such as optical character recognition and DNA analysis, and demonstrates that kernel-based approaches achieve competitive results across a range of benchmarks with different noise levels and robustness requirements.
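
The kernel-Adatron mentioned above is indeed simple to implement: it performs perceptron-like online updates on the dual coefficients, pushing every training point's margin toward 1. The following NumPy sketch illustrates the idea; the RBF kernel, learning rate, epoch count, and toy data are illustrative assumptions, not details taken from the chapter.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Gram matrix K[i, j] = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_adatron(K, y, lr=0.1, epochs=200):
    # Online updates on the dual variables alpha (one per training point).
    # Each step nudges point i's functional margin toward 1, clipping
    # alpha at zero so the dual constraints alpha_i >= 0 are kept.
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * np.sum(alpha * y * K[i])  # y_i * f(x_i)
            alpha[i] = max(0.0, alpha[i] + lr * (1.0 - margin))
    return alpha

def predict(K_test_train, alpha, y):
    # Decision function f(x) = sum_j alpha_j y_j K(x, x_j)
    return np.sign(K_test_train @ (alpha * y))

# Toy problem (assumed for illustration): two well-separated Gaussian blobs.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-2, 0.5, (20, 2)), rng.normal(2, 0.5, (20, 2))])
y = np.array([-1] * 20 + [1] * 20)
K = rbf_kernel(X, gamma=0.5)
alpha = kernel_adatron(K, y)
preds = predict(K, alpha, y)
print("training accuracy:", np.mean(preds == y))
```

On separable data such as this, the clipped additive updates drive the margins toward 1, so the learned hyperplane approaches the maximum-margin solution that a support vector machine would find.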