ABSTRACT

In the discussion of the nearest-neighbor classifier (section 4.1.2), we viewed the classifier in terms of the decision boundary that it defines in the feature space, as, for example, in figure 4.4. This can be a very profitable way of viewing classifiers in general, and a number of methods focus explicitly on constructing good boundaries. We will discuss those methods in the next chapter. Before we can do so, however, we require some additional mathematical background. This chapter treats three topics in linear algebra: the formal representation of a high-dimensional linear boundary, that is, a hyperplane; the gradient, along with rules for taking derivatives of linear-algebraic expressions; and constrained optimization.
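
As a brief preview of the first topic, a hyperplane in d-dimensional space can be written as the set of points satisfying a single linear equation (the symbols w, for the normal or weight vector, and b, for the offset, are placeholder names used here; the chapter introduces its own notation):

\[
\{\, x \in \mathbb{R}^{d} : w^{\top} x + b = 0 \,\}, \qquad w \in \mathbb{R}^{d},\ b \in \mathbb{R}.
\]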