
This is referred to as Jensen's inequality: for finitely many points x₁, …, xₙ ∈ S and nonnegative coefficients α₁, …, αₙ with α₁ + ⋯ + αₙ = 1, it reads F(α₁x₁ + ⋯ + αₙxₙ) ⪯ α₁F(x₁) + ⋯ + αₙF(xₙ). F is strictly convex if Jensen's inequality holds with ≺ whenever the points are not all identical and none of the αₖ's equals one. If F is real-valued, the orderings ⪯ and ≺ coincide with the usual ≤ and < for real numbers.

In general it is not easy to verify directly whether a function is convex. However, a twice continuously differentiable function F : S → R on a convex set S with interior points is convex if and only if its Hessian is positive semidefinite, that is, ∂²F(x) ⪰ 0 for all x ∈ S; a numerical illustration of this test is given below. Moreover, if F defined on S is convex, then the sublevel sets {x ∈ S | F(x) ≺ H} are convex for any H ∈ Hm (a one-line proof is sketched below).

The most important reason for considering convex functions in optimization is the fact that local minimal points of convex functions are actually global minimal points. Precisely, x₀ ∈ S is a local minimal point of F if there exists an open neighborhood N(x₀) of x₀ such that F(x₀) ⪯ F(x) for all x ∈ N(x₀) ∩ S. If F is convex, one can conclude that F(x₀) ⪯ F(x) for all x ∈ S, showing that x₀ is in fact a global minimal point of F. This property is of great interest in numerical optimization: many efficient algorithms exist for the numerical computation of local minima of real-valued functions, and if these functions are convex, such algorithms actually determine global minimal points.
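The argument behind this local-to-global claim is short and worth recording. A sketch in LaTeX, using only the definitions above; the interpolation parameter t and the point x_t are auxiliary notation introduced here, not taken from the text:

    % Claim: a local minimal point x_0 of a convex F is globally minimal.
    % Fix an arbitrary x in S and interpolate from x_0 towards x.
    \begin{align*}
      & x_t := (1-t)\,x_0 + t\,x \in N(x_0) \cap S
        \quad \text{for all sufficiently small } t \in (0,1],\\
      & F(x_0) \preceq F(x_t) \preceq (1-t)\,F(x_0) + t\,F(x)
        \quad \text{(local minimality, then convexity)},\\
      & \text{so } t\,F(x_0) \preceq t\,F(x),
        \text{ and dividing by } t>0 \text{ gives } F(x_0) \preceq F(x).
    \end{align*}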
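Similarly, the convexity of the sublevel sets {x ∈ S | F(x) ≺ H} noted above follows from the two-point form of Jensen's inequality; a one-line sketch:

    % If F(x_1) \prec H and F(x_2) \prec H, then for any \alpha in [0,1]
    % (at least one of \alpha, 1-\alpha is positive, so strictness survives):
    \[
      F(\alpha x_1 + (1-\alpha)x_2)
        \;\preceq\; \alpha F(x_1) + (1-\alpha)F(x_2)
        \;\prec\; \alpha H + (1-\alpha)H \;=\; H,
    \]
    % hence \alpha x_1 + (1-\alpha)x_2 lies in the sublevel set as well.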
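On the computational side, the second-order test is easy to check numerically for real-valued functions. A minimal sketch in Python, assuming NumPy; the quadratic example and the helper name is_psd are illustrative choices, not from the text. For F(x) = ½xᵀQx + bᵀx with symmetric Q, the Hessian is the constant matrix Q, so convexity on all of Rⁿ reduces to Q having no negative eigenvalues:

    import numpy as np

    def is_psd(H, tol=1e-10):
        # A symmetric matrix is positive semidefinite iff all its
        # eigenvalues are nonnegative; eigvalsh exploits symmetry.
        return bool(np.all(np.linalg.eigvalsh(H) >= -tol))

    # F(x) = 0.5 x^T Q x + b^T x has constant Hessian Q.
    Q = np.array([[2.0, 0.5],
                  [0.5, 1.0]])
    print(is_psd(Q))        # True:  this F is convex on R^2

    Q_indef = np.array([[1.0, 0.0],
                        [0.0, -1.0]])
    print(is_psd(Q_indef))  # False: this F is not convex

The tolerance guards against tiny negative eigenvalues produced by floating-point roundoff; for a non-quadratic F one would apply the same check to ∂²F(x) at each point of interest.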