ABSTRACT

This chapter introduces linear classification as simple projections onto a linear subspace. It develops the idea of maximizing the separation between classes, deriving Fisher’s discriminant analysis and Linear Discriminant Analysis. The Perceptron was invented in 1957 by Rosenblatt and built in 1958 as a physical machine for image recognition. The chapter shows that the perceptron is a simple neural network and implements it as a single-layer neural network within MATLAB’s neural network framework. The MATLAB code uses the Support Vector Machine implementation to compute the support vectors and plot the separating line. The Support Vector Machine is derived by maximizing the margin between classes, and the Lagrangian function combines the objective function and the constraints into a single function. The chapter closes with the advantages of recasting the problem in terms of the relationships between samples, which leads to non-linear classification via the kernel trick.
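As a concrete illustration of the single-layer implementation mentioned above, here is a minimal sketch using the perceptron function from MATLAB's Deep Learning Toolbox; the AND-gate toy data and variable names are illustrative assumptions, not taken from the chapter:

```matlab
% Perceptron as a single-layer network (Deep Learning Toolbox).
% Toy AND-gate data (assumed for illustration); inputs are columns,
% targets are 0/1.
X = [0 0 1 1; 0 1 0 1];
T = [0 0 0 1];

net = perceptron;        % single layer with a hard-limit transfer function
net = train(net, X, T);  % trains with the perceptron learning rule
Y   = net(X);            % classify the training inputs
```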
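The margin-maximization and Lagrangian steps summarized above can be written compactly. A sketch in standard notation, assuming a linearly separable dataset of samples x_n with labels t_n in {-1, +1}; the symbols w, b, and a_n are notational assumptions, not taken from the chapter:

```latex
% Hard-margin SVM: the margin is 2/||w||, so maximizing it is
% equivalent to minimizing ||w||^2 subject to each sample lying
% on the correct side of the margin.
\begin{align}
  \min_{\mathbf{w},\, b} \quad & \tfrac{1}{2}\,\lVert \mathbf{w} \rVert^{2} \\
  \text{s.t.} \quad & t_n \left( \mathbf{w}^{\mathsf{T}} \mathbf{x}_n + b \right) \geq 1,
      \qquad n = 1, \dots, N
\end{align}
% The Lagrangian combines the objective and the constraints into one
% function via multipliers a_n >= 0; support vectors are the samples
% with a_n > 0.
\begin{equation}
  L(\mathbf{w}, b, \mathbf{a}) = \tfrac{1}{2}\,\lVert \mathbf{w} \rVert^{2}
    - \sum_{n=1}^{N} a_n \left[ t_n \left( \mathbf{w}^{\mathsf{T}} \mathbf{x}_n + b \right) - 1 \right]
\end{equation}
% Eliminating w and b yields the dual, which depends on the samples
% only through inner products; replacing x_n^T x_m with a kernel
% k(x_n, x_m) is the kernel trick.
\begin{equation}
  \tilde{L}(\mathbf{a}) = \sum_{n=1}^{N} a_n
    - \tfrac{1}{2} \sum_{n=1}^{N} \sum_{m=1}^{N}
      a_n a_m\, t_n t_m\, \mathbf{x}_n^{\mathsf{T}} \mathbf{x}_m
\end{equation}
```

The dual form makes the recasting explicit: the data enter only through pairwise relationships between samples, which is what enables non-linear classification.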
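Since the chapter uses MATLAB's Support Vector Machine implementation, a sketch along these lines is possible with fitcsvm from the Statistics and Machine Learning Toolbox; the toy data and variable names below are illustrative assumptions, not the chapter's code:

```matlab
% Two linearly separable classes (toy data, illustrative only).
rng(1);                                  % reproducible example
X = [randn(20,2) + 2; randn(20,2) - 2];  % 40 samples, 2 features
y = [ones(20,1); -ones(20,1)];           % class labels +1 / -1

% Train a linear SVM and recover the support vectors.
mdl = fitcsvm(X, y, 'KernelFunction', 'linear');
sv  = mdl.SupportVectors;

% For a linear kernel the separating line satisfies w'*x + b = 0.
w = mdl.Beta;
b = mdl.Bias;

% Plot the samples, circle the support vectors, and draw the line.
gscatter(X(:,1), X(:,2), y); hold on;
plot(sv(:,1), sv(:,2), 'ko', 'MarkerSize', 10);
x1 = linspace(min(X(:,1)), max(X(:,1)), 100);
plot(x1, -(w(1)*x1 + b) / w(2), 'k-');   % decision boundary
hold off;
```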