Conference Papers, Year: 2018

Adaptive black-box optimization got easier: HCT only needs local smoothness

Xuedong Shang
Emilie Kaufmann
Michal Valko

Abstract

Hierarchical bandits are an approach for the global optimization of extremely irregular functions. This paper provides new elements regarding POO, an adaptive meta-algorithm that does not require knowledge of the local smoothness of the target function. We first highlight the fact that the subroutine algorithm used in POO should have small regret under an assumption of local smoothness with respect to the chosen partitioning, and it is not known whether this assumption is satisfied by the standard subroutine HOO. In this work, we establish such a regret guarantee for HCT, another hierarchical optimistic optimization algorithm that needs to know the smoothness. This confirms the validity of POO. We show that POO can be used with HCT as a subroutine, with a regret upper bound that matches that of the best-known algorithms using knowledge of the smoothness, up to a √(log n) factor.
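To make the scheme described above concrete, the following minimal Python sketch shows a hierarchical optimistic optimizer over a binary partitioning of [0, 1] that needs a smoothness guess rho (in the spirit of HOO/HCT), wrapped by a POO-like meta-algorithm that runs one instance per value of rho on a grid and keeps the best recommendation. This is an illustrative toy version, not the authors' HCT or POO: the names (hoo, poo_like, nu1, rho_grid), the fixed budget split, and the noisy test function are assumptions made for this example, and the real POO allocates its budget to the instances adaptively rather than splitting it evenly.

# Illustrative sketch (not the authors' exact algorithms): a simplified
# HOO-style hierarchical optimistic optimizer on [0, 1] and a POO-like
# wrapper that hedges over the unknown smoothness parameter rho.
import math
import random


class Node:
    def __init__(self, depth, lo, hi):
        self.depth, self.lo, self.hi = depth, lo, hi
        self.count, self.mean = 0, 0.0
        self.children = None  # becomes (left, right) once expanded

    def mid(self):
        return 0.5 * (self.lo + self.hi)


def hoo(f, budget, rho, nu1=1.0):
    """Simplified HOO run with smoothness parameters (nu1, rho)."""
    root = Node(0, 0.0, 1.0)

    def b_value(node, t):
        # Optimistic upper bound on the best value inside the node's cell.
        if node.count == 0:
            return float("inf")
        u = node.mean + math.sqrt(2.0 * math.log(t) / node.count) + nu1 * rho ** node.depth
        if node.children is None:
            return u
        return min(u, max(b_value(c, t) for c in node.children))

    for t in range(1, budget + 1):
        # Optimistic traversal: follow the child with the larger B-value.
        path, node = [root], root
        while node.children is not None:
            node = max(node.children, key=lambda c: b_value(c, t))
            path.append(node)
        # Expand the selected leaf and sample the function at its midpoint.
        m = node.mid()
        node.children = (Node(node.depth + 1, node.lo, m),
                         Node(node.depth + 1, m, node.hi))
        reward = f(m)
        for n_ in path:  # update empirical means along the traversed path
            n_.count += 1
            n_.mean += (reward - n_.mean) / n_.count

    # Recommend the midpoint reached by following the most-sampled children.
    node = root
    while node.children is not None:
        node = max(node.children, key=lambda c: c.count)
    return node.mid()


def poo_like(f, budget, rho_grid=(0.25, 0.5, 0.75, 0.9)):
    """Hedge over the unknown smoothness: one HOO instance per rho guess."""
    share = budget // len(rho_grid)
    candidates = [hoo(f, share, rho) for rho in rho_grid]
    # Keep the candidate with the best (noisy) re-evaluation average.
    return max(candidates, key=lambda x: sum(f(x) for _ in range(20)) / 20)


if __name__ == "__main__":
    # Noisy "garland"-style test function, a common benchmark for such methods.
    def f(x):
        return x * (1 - x) * (4 - math.sqrt(abs(math.sin(60 * x)))) + random.gauss(0, 0.1)

    print("recommended x ~", round(poo_like(f, budget=2000), 4))

The equal budget split above only conveys the idea of running several smoothness guesses in parallel; the analysis in the paper concerns interleaving HCT instances so that no fixed grid of rho values has to contain the right one.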
Main file: shang2018adaptive.pdf (628.48 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01874637, version 1 (14-09-2018)

Identifiers

  • HAL Id: hal-01874637, version 1

Cite

Xuedong Shang, Emilie Kaufmann, Michal Valko. Adaptive black-box optimization got easier: HCT only needs local smoothness. European Workshop on Reinforcement Learning, Oct 2018, Lille, France. ⟨hal-01874637⟩
206 views
202 downloads
