ABSTRACT

One-dimensional problems are important less in their own right than as parts of larger problems. ‘Minimisation’ along a line is a part of both the conjugate gradients and variable metric methods for the solution of general function minimisation problems, though in this book the search for a minimum will only proceed until a satisfactory new point has been found. Alternatively, a linear search is useful when only one parameter is varied in a complicated function, for instance when trying to discover the response of some model of a system to changes in one of its controls.

Roots of functions of one variable are less commonly needed as a part of larger algorithms. They arise in attempts to minimise functions by setting derivatives to zero. Since this approach finds maxima and saddle points as well as minima, I do not recommend it in normal circumstances.

Roots of polynomials are another problem which I normally avoid, as some clients have a nasty habit of trying to solve eigenproblems by means of the characteristic equation. The polynomial root-finding problem is very often inherently unstable: very small changes in the polynomial coefficients give rise to large changes in the roots. Furthermore, this situation is easily worsened by ill-chosen solution methods. The only genuine polynomial root-finding problem I have encountered in practice is the internal rate of return (example 12.4). However, accountants and economists have very good ideas about where they would like the root to be found, so I have not tried to develop general methods for finding all the roots of a polynomial, for instance by methods such as those discussed by Jenkins and Traub (1975). Some experiments I have performed with S G Nash (unpublished) on the use of matrix eigenvalue algorithms applied to the companion matrices of polynomials were not very encouraging as to accuracy or speed, even though we had expected such methods to be slow.
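A small sketch of this instability, using Wilkinson's well-known polynomial with roots 1, 2, ..., 20 (an illustration of mine, not an example from the text): perturbing a single coefficient by about one part in 10^9 moves several computed roots by whole units and drives some of them off the real axis.

    import numpy as np

    # Wilkinson's polynomial: p(x) = (x - 1)(x - 2)...(x - 20).
    true_roots = np.arange(1, 21, dtype=float)
    coeffs = np.poly(true_roots)            # coefficients, highest power first

    # Perturb the coefficient of x^19 (which is -210) by 2^-23,
    # a relative change of roughly 6 parts in 10^10.
    perturbed = coeffs.copy()
    perturbed[1] -= 2.0 ** -23

    roots = np.sort_complex(np.roots(perturbed))
    print(np.max(np.abs(roots - true_roots)))   # several units, not 10^-10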
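The internal rate of return also illustrates why a good prior idea of the root's location makes general machinery unnecessary. In a sketch (the cash flows and function names here are hypothetical, not those of example 12.4), the rate r sought is the root of the net present value NPV(r) = sum over t of c_t / (1 + r)^t, and a simple bisection over a bracket supplied by the practitioner is quite adequate:

    def npv(rate, cashflows):
        # Net present value of cash flows c_0, c_1, ... at the given rate.
        return sum(c / (1.0 + rate) ** t for t, c in enumerate(cashflows))

    def irr_bisect(cashflows, lo=0.0, hi=1.0, tol=1e-10):
        # Assumes NPV changes sign on [lo, hi] -- the practitioner's
        # "very good idea" of where the root should be found.
        flo = npv(lo, cashflows)
        while hi - lo > tol:
            mid = 0.5 * (lo + hi)
            fmid = npv(mid, cashflows)
            if flo * fmid <= 0.0:
                hi = mid             # root lies in [lo, mid]
            else:
                lo, flo = mid, fmid  # root lies in [mid, hi]
        return 0.5 * (lo + hi)

    # An outlay of 1000 now followed by five annual receipts of 300.
    print(irr_bisect([-1000.0, 300.0, 300.0, 300.0, 300.0, 300.0]))  # about 0.152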
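For reference, the companion-matrix idea mentioned above can be sketched briefly (an illustration of the technique only, not the code used in the experiments with S G Nash): the roots of a polynomial are the eigenvalues of the companion matrix of its monic form.

    import numpy as np

    def companion_roots(coeffs):
        # coeffs = [a_n, a_(n-1), ..., a_0], highest power first, a_n != 0.
        c = np.asarray(coeffs, dtype=float)
        c = c / c[0]                       # reduce to monic form
        n = c.size - 1
        C = np.zeros((n, n))
        C[1:, :-1] = np.eye(n - 1)         # ones on the subdiagonal
        C[:, -1] = -c[:0:-1]               # last column: -[a_0, a_1, ..., a_(n-1)]
        return np.linalg.eigvals(C)        # eigenvalues of C are the roots

    print(companion_roots([1.0, -3.0, 2.0]))   # roots of x^2 - 3x + 2: 1 and 2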