ABSTRACT

Stochastic optimization is based on the concept of random search: finding an optimum for a given function thus implies sampling different values of the decision variables and evaluating the objective function at each of those test points until a good solution is found. Stochastic programming is the subset of optimization problems in which the algebraic constraints involve uncertainties. Stochastic and deterministic methods are usually regarded as opposite approaches to optimization because of the differences in the basis from which each method is developed. The aforementioned generalities about stochastic optimization apply to the solution of unconstrained problems. Genetic algorithms are among the earliest stochastic optimization methods; in a genetic algorithm, the crossover operator is applied to all of the selected individuals, and the number of generations must be tuned to ensure that the algorithm converges to a region close to the global optimum. Differential evolution is a related evolutionary method.
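As a minimal sketch of the evolutionary methods mentioned above, the following implements a basic differential evolution loop (DE/rand/1/bin) for a box-constrained continuous problem. The function name, the test objective, and the parameter defaults (population size, mutation factor F, crossover rate CR) are illustrative choices, not taken from the text.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           generations=100, seed=0):
    """Minimize f over box bounds with a basic DE/rand/1/bin scheme."""
    rng = random.Random(seed)
    dim = len(bounds)
    # Initialize the population uniformly within the search box.
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    costs = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            # Pick three distinct individuals, all different from i.
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            # Mutation plus binomial crossover builds the trial vector.
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    lo, hi = bounds[j]
                    v = min(max(v, lo), hi)  # clip back into the box
                else:
                    v = pop[i][j]
                trial.append(v)
            # Greedy selection: the trial replaces i only if it is no worse.
            trial_cost = f(trial)
            if trial_cost <= costs[i]:
                pop[i], costs[i] = trial, trial_cost
    best = min(range(pop_size), key=costs.__getitem__)
    return pop[best], costs[best]

# Example: minimize the 2-D sphere function, whose optimum is the origin.
x_best, f_best = differential_evolution(lambda x: sum(v * v for v in x),
                                        bounds=[(-5.0, 5.0)] * 2)
```

Because the search is driven by random sampling, repeated runs with different seeds generally converge to the same region but not to identical points, which is why the number of generations (like the number of GA generations noted above) must be tuned.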