ABSTRACT

This chapter concerns the problem of optimisation, that is, finding the maximum or minimum of a function. The search for efficient optimisation techniques is one of the major endeavours of modern mathematics. We will consider this problem first in one dimension, then in higher dimensions. We will restrict our attention to maxima, but everything we say applies equally well to minima, by the simple expedient of multiplying the function by −1. For univariate functions we will consider Newton’s method and the golden-section method. For multivariate functions we will consider Newton’s method (again) and steepest ascent. We also provide some basic information about the optimisation tools that are available in R.
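
As a first taste of those tools, the following is a minimal sketch (not taken from the chapter itself) showing the negation trick with R's built-in optimisers: optimise() searches in one dimension, optim() in several, and both minimise unless told otherwise.

    ## univariate function with its maximum at x = 2
    f <- function(x) -(x - 2)^2 + 5

    ## optimise() can maximise directly ...
    optimise(f, interval = c(0, 5), maximum = TRUE)

    ## ... or we can minimise -f(x), which locates the same point
    optimise(function(x) -f(x), interval = c(0, 5))

    ## optim() minimises by default; fnscale = -1 flips it to maximisation
    g <- function(x) -sum((x - c(1, 3))^2)   # bivariate, maximum at (1, 3)
    optim(c(0, 0), g, control = list(fnscale = -1))

The example function f and starting values are illustrative only; the chapter develops its own examples for Newton's method, the golden-section method, and steepest ascent.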