ABSTRACT

This chapter presents portions of the theory that derive from linear approximations of differentiable functions in the neighborhood of a point. This section is a non-technical overview of computational complexity and the role it plays in solving problems. In a minimization LP with objective function c·x, moving in the direction -c decreases the objective value at the fastest rate. Since the Hessian matrix of a twice continuously differentiable function is symmetric, these results give a useful characterization of positive definite Hessians: a symmetric matrix M is positive definite if and only if all its eigenvalues are strictly positive. NLPs for which finding a local optimum is not adequate are called global optimization problems. With a few exceptions, global optimization comprises all of nonlinear optimization except convex optimization. Finding a global optimum with certainty in finite time is generally considered to be a strong performance guarantee.
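The eigenvalue characterization of positive definiteness mentioned above can be checked numerically. The sketch below is illustrative rather than part of the chapter; the function name is invented, and NumPy is assumed:

```python
import numpy as np

def is_positive_definite(M, tol=1e-10):
    """Check positive definiteness of a symmetric matrix via its eigenvalues.

    A symmetric matrix is positive definite if and only if all of its
    eigenvalues are strictly positive; tol guards against round-off.
    """
    eigenvalues = np.linalg.eigvalsh(M)  # eigvalsh assumes M is symmetric
    return bool(np.all(eigenvalues > tol))

# A symmetric matrix with eigenvalues 1 and 3: positive definite.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
# A symmetric matrix with eigenvalues -1 and 3: indefinite.
B = np.array([[1.0, 2.0], [2.0, 1.0]])

print(is_positive_definite(A))  # True
print(is_positive_definite(B))  # False
```

For a Hessian, this test at a stationary point certifies a strict local minimum, which is exactly why the symmetry of the Hessian makes this characterization useful.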