Journal article in IEEE Transactions on Evolutionary Computation, 2024

A Divergence-Based Condition to Ensure Quantile Improvement in Black-Box Global Optimization

Abstract

Black-box global optimization aims at minimizing an objective function whose analytical form is not known. To do so, many state-of-the-art methods rely on sampling-based strategies, where sampling distributions are built iteratively so that their mass concentrates where the objective function is low. Despite empirical success, the theoretical study of these methods remains difficult. In this work, we introduce a new framework, based on divergence-decrease conditions, to study and design black-box global optimization algorithms. Our approach allows us to establish and quantify the improvement of the sampling distributions at each iteration, in terms of the expected value or a quantile of the objective. We show that the information-geometric optimization approach fits within our framework, yielding a new approach for its analysis. We also establish sampling-distribution improvement results for two novel algorithms: one related to the cross-entropy method with mixture models, and another using heavy-tailed sampling distributions.
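To illustrate the iterative concentration of a sampling distribution that the abstract describes, the following minimal Python sketch implements the classical cross-entropy method with a single Gaussian sampler. It is not the paper's algorithm (which uses mixture models and heavy-tailed distributions, with divergence-decrease guarantees); all function names and parameter values below are illustrative assumptions.

import numpy as np

def cross_entropy_minimize(f, dim, n_samples=200, elite_frac=0.1,
                           n_iters=50, seed=0):
    # Iteratively refit a Gaussian sampling distribution on the
    # elite (lowest-objective) samples, so that its mass
    # concentrates where f is low.
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), 5.0 * np.ones(dim)  # broad initial sampler
    n_elite = max(1, int(elite_frac * n_samples))
    for _ in range(n_iters):
        x = rng.normal(mean, std, size=(n_samples, dim))  # draw candidates
        scores = np.apply_along_axis(f, 1, x)             # black-box calls
        elite = x[np.argsort(scores)[:n_elite]]           # keep best fraction
        mean = elite.mean(axis=0)                         # refit the sampler
        std = elite.std(axis=0) + 1e-8                    # avoid collapse
    return mean  # center of the final sampling distribution

# Example: minimize a shifted sphere function; the result approaches [3., 3.]
f = lambda x: np.sum((x - 3.0) ** 2)
print(cross_entropy_minimize(f, dim=2))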
Main file: divergenceConditionNeutral.pdf (365.99 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04711649, version 1 (27-09-2024)


Identifiers

HAL Id: hal-04711649
DOI: 10.1109/TEVC.2024.3452420

Cite

Thomas Guilmeau, Emilie Chouzenoux, Víctor Elvira. A Divergence-Based Condition to Ensure Quantile Improvement in Black-Box Global Optimization. IEEE Transactions on Evolutionary Computation, 2024, pp.1-1. ⟨10.1109/TEVC.2024.3452420⟩. ⟨hal-04711649⟩