ABSTRACT

A support vector machine (SVM) is a classification technique for two classes. For 2-dimensional data, an SVM computes an optimal straight line that separates the two classes; for d-dimensional data, it computes an optimal hyperplane. Real-world data are not always linearly separable, and the two classes may have a non-linear boundary. SVMs can also determine non-linear boundaries: a kernel function implicitly transforms the data to a higher-dimensional space in which the classes become linearly separable. In this Chapter, the following topics are presented: the SVM algorithm for linearly separable data, the soft-margin SVM (C-SVM) algorithm, the SVM algorithm for non-linearly separable data, the selection of the SVM hyper-parameters, and ε-support vector regression (ε-SVR). A set of exercises and an SVM project are given at the end of the Chapter.
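The contrast between the linear and the kernelized case described above can be previewed with a short sketch. This is illustrative only, not the Chapter's own material: it assumes scikit-learn is available, and the datasets and parameter values (C, the RBF kernel) are made up for demonstration.

```python
# Illustrative sketch (assumes scikit-learn): a linear SVM on linearly
# separable data, and a kernel (RBF) SVM on data with a non-linear boundary.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Linearly separable case: two well-separated Gaussian blobs in 2-D.
X_lin = np.vstack([rng.normal(-2, 0.5, (50, 2)), rng.normal(2, 0.5, (50, 2))])
y_lin = np.array([0] * 50 + [1] * 50)
linear_svm = SVC(kernel="linear", C=1.0)  # C is the soft-margin penalty
linear_svm.fit(X_lin, y_lin)

# Non-linear boundary: one class inside a ring formed by the other, so no
# straight line in the original 2-D space can separate them.
angles = rng.uniform(0, 2 * np.pi, 100)
inner = rng.normal(0, 0.3, (100, 2))
outer = (np.column_stack([np.cos(angles), np.sin(angles)]) * 2
         + rng.normal(0, 0.2, (100, 2)))
X_non = np.vstack([inner, outer])
y_non = np.array([0] * 100 + [1] * 100)

# The RBF kernel implicitly maps the data to a higher-dimensional space
# where the two classes become linearly separable.
rbf_svm = SVC(kernel="rbf", C=1.0, gamma="scale")
rbf_svm.fit(X_non, y_non)

print("linear SVM accuracy:", linear_svm.score(X_lin, y_lin))
print("RBF SVM accuracy:", rbf_svm.score(X_non, y_non))
```

On these toy datasets both models separate the classes almost perfectly; a linear kernel on the ring data, by contrast, would fail, which is the motivation for the kernel methods developed in the Chapter.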