ABSTRACT

In this paper, a new globally convergent first-order batch training algorithm is proposed that employs the sign-based updates of the composite nonlinear Jacobi-Rprop method. The algorithm is a modification of Jacobi-Rprop built on a mathematical framework for convergence analysis, which ensures that the search direction is always a descent direction. This approach accelerated learning and outperformed a recently proposed modification, the Jacobi-bisection method, in all cases tested.