Conference paper, Year: 2016

Noisy Optimization: Fast Convergence Rates with Comparison-Based Algorithms

Marie-Liesse Cauwet
Olivier Teytaud

Abstract

Derivative-Free Optimization is known to be an efficient and robust approach to black-box optimization problems. On noisy functions, however, classical comparison-based algorithms are slower than gradient-based algorithms. For quadratic functions, Evolutionary Algorithms without large mutations achieve a simple regret of at best $O(1/\sqrt{N})$, where $N$ is the number of function evaluations, whereas stochastic gradient descent can (tightly) reach a simple regret of $O(1/N)$. It has been conjectured that gradient approximation by finite differences (hence, not a comparison-based method) is necessary to reach such an $O(1/N)$ rate. We answer this conjecture in the negative, providing a comparison-based algorithm as good as gradient methods, i.e., reaching $O(1/N)$, under the condition, however, that the noise is Gaussian. Experimental results confirm the $O(1/N)$ simple regret, i.e., a squared rate compared to many published results at $O(1/\sqrt{N})$.
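As a rough illustration of the simple-regret rates discussed in the abstract, the following minimal Python sketch runs plain stochastic gradient descent on a Gaussian-noisy quadratic and reports the simple regret for growing evaluation budgets. It is not the paper's comparison-based algorithm; the sphere objective, noise level, and step-size schedule are assumptions chosen only to exhibit the $O(1/N)$ rate attainable by gradient-based methods on this problem.

import numpy as np

# Illustrative sketch (assumptions: sphere objective f(x) = ||x||^2, additive
# Gaussian noise on the gradient, decreasing step size c/t). This is NOT the
# paper's comparison-based algorithm; it only illustrates the O(1/N) simple
# regret that gradient-based methods attain on a noisy quadratic.

def sgd_simple_regret(dim=10, budget=100_000, step_const=0.5, noise=1.0, seed=0):
    rng = np.random.default_rng(seed)
    x = rng.normal(size=dim)                           # random starting point
    for t in range(1, budget + 1):
        grad = 2.0 * x + noise * rng.normal(size=dim)  # noisy gradient of ||x||^2
        x -= (step_const / t) * grad                   # step size c/t
    return float(x @ x)                                # simple regret; optimum is 0

if __name__ == "__main__":
    for n in (10_000, 100_000, 1_000_000):
        print(n, sgd_simple_regret(budget=n))          # shrinks roughly like 1/n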
Main file: mca.pdf (194.4 Ko)
Origin: Files produced by the author(s)

Dates and versions

hal-01306636 , version 1 (25-04-2016)
hal-01306636 , version 2 (27-04-2016)

Identifiers

HAL Id: hal-01306636

Cite

Marie-Liesse Cauwet, Olivier Teytaud. Noisy Optimization: Fast Convergence Rates with Comparison-Based Algorithms. Genetic and Evolutionary Computation Conference, Jul 2016, Denver, United States. pp.1101-1106. ⟨hal-01306636v2⟩
