Properties of the sign gradient descent algorithms
Abstract
The aim of this article is to study the properties of sign gradient descent algorithms, which use the sign of the gradient instead of the gradient itself and were first introduced in the RPROP algorithm. The article provides two convergence results for local optimization: one for nominal systems without uncertainty and one for systems with uncertainties. New sign gradient descent algorithms, including the dichotomy algorithm DICHO, are applied to several examples to demonstrate their effectiveness in terms of speed of convergence. As a novelty, sign gradient descent algorithms can in practice converge towards minima other than the one closest to the initial condition, which makes them suitable for global optimization as a new metaheuristic method.
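For illustration, a minimal sketch of the basic sign gradient descent update is given below. The fixed step size `alpha`, the iteration budget, and the quadratic test function are assumptions chosen for the example and are not taken from the paper, which studies more elaborate variants such as DICHO.

```python
import numpy as np

def sign_gradient_descent(grad, x0, alpha=0.05, n_iter=200):
    """Basic sign gradient descent sketch: step along -sign(grad) with a fixed step size.

    With a constant step size the iterates do not converge exactly but end up
    oscillating within a band of width about alpha around the minimizer.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(n_iter):
        # Componentwise sign of the gradient replaces the raw gradient.
        x = x - alpha * np.sign(grad(x))
    return x

# Illustrative example (not from the paper): minimize f(x) = ||x||^2, gradient 2x.
x_min = sign_gradient_descent(lambda x: 2.0 * x, x0=[3.0, -1.5])
print(x_min)
```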