ABSTRACT

Stochastic methods have gained popularity in global optimization because most of them do not require the cost function to be differentiable, are able to escape local optima, and on some problems converge even faster than gradient-based optimization methods. The present paper proposes an optimization method that reduces the search space by means of densification curves, coupled with the Dynamic Canonical Descent Algorithm. The performance of the new method is demonstrated on several well-known problems classically used for testing optimization algorithms, on which it outperforms established algorithms such as Simulated Annealing and Genetic Algorithms.