A Divergence-Based Condition to Ensure Quantile Improvement in Black-Box Global Optimization
Abstract
Black-box global optimization aims to minimize an objective function whose analytical form is not known. To do so, many state-of-the-art methods rely on sampling-based strategies, where sampling distributions are built iteratively so that their mass concentrates where the objective function is low. Despite empirical success, the theoretical study of these methods remains difficult. In this work, we introduce a new framework, based on divergence-decrease conditions, to study and design black-box global optimization algorithms. Our approach makes it possible to establish and quantify the improvement of the sampling distributions at each iteration, in terms of the expected value or a quantile of the objective. We show that the information-geometric optimization approach fits within our framework, yielding a new way to analyze it. We also establish sampling-distribution improvement results for two novel algorithms: one related to the cross-entropy approach with mixture models, and another using heavy-tailed sampling distributions.
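To illustrate the class of sampling-based strategies the abstract refers to (not the paper's own algorithms), the sketch below shows a minimal cross-entropy-style loop with a Gaussian sampling family: at each iteration the distribution is refit to the best samples, so its mass concentrates where the objective is low. Function names and parameter values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def cross_entropy_sketch(objective, dim, n_samples=100, n_elite=10, n_iters=50, seed=0):
    """Minimal cross-entropy-style loop: iteratively refit a Gaussian sampling
    distribution to the elite (lowest-objective) samples."""
    rng = np.random.default_rng(seed)
    mean, std = np.zeros(dim), np.ones(dim)
    for _ in range(n_iters):
        samples = rng.normal(mean, std, size=(n_samples, dim))
        values = np.array([objective(x) for x in samples])
        elite = samples[np.argsort(values)[:n_elite]]          # best samples
        mean, std = elite.mean(axis=0), elite.std(axis=0) + 1e-8  # refit distribution
    return mean

# Example: minimize a shifted sphere function in 5 dimensions.
best = cross_entropy_sketch(lambda x: np.sum((x - 1.0) ** 2), dim=5)
```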