ABSTRACT


It is well known that convex functions form an ideal class of functions for minimization purposes. Indeed, if $f : \mathbb{R}^n \to \mathbb{R}$ is a differentiable convex function, then the global minimizers of $f$ over $\mathbb{R}^n$, that is, the solutions of the unconstrained optimization problem

\[
(P) \qquad \inf_{x \in \mathbb{R}^n} f(x),
\]

are simply the stationary points of $f$, that is, the points $x \in \mathbb{R}^n$ for which $\nabla f(x)$, the gradient of $f$ at $x$, is the null vector. This is an ideal situation since, in general, the condition ``$\nabla f(x) = 0$'' is only a necessary one for the point $x$ to be a local minimizer of $f$.
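
To recall why stationarity is also sufficient in the convex differentiable case, one may use the classical gradient inequality; the following is only a brief sketch of this standard argument, with $\bar{x}$ denoting a stationary point introduced here for illustration. For every $y \in \mathbb{R}^n$,
\[
f(y) \;\ge\; f(\bar{x}) + \langle \nabla f(\bar{x}),\, y - \bar{x} \rangle ,
\]
so that, whenever $\nabla f(\bar{x}) = 0$,
\[
f(y) \;\ge\; f(\bar{x}) \qquad \text{for all } y \in \mathbb{R}^n,
\]
that is, $\bar{x}$ is a global minimizer of $f$ and hence a solution of $(P)$.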