Conference paper, 2023

Asynchronous Decentralized Bayesian Optimization for Large Scale Hyperparameter Optimization

Abstract

Bayesian optimization (BO) is a promising approach for hyperparameter optimization of deep neural networks (DNNs), where each model training can take minutes to hours. In BO, a computationally cheap surrogate model is employed to learn the relationship between hyperparameter configurations and their performance metrics, such as accuracy. Parallel BO methods often adopt a single-manager/multiple-workers strategy to evaluate several hyperparameter configurations simultaneously. Even though hyperparameter evaluations dominate the runtime, the manager overhead in such centralized schemes prevents these methods from scaling to a large number of workers. We present an asynchronous decentralized BO method, in which each worker runs sequential BO and asynchronously communicates its results through shared storage. We scale our method without loss of computational efficiency, with over 95% worker utilization, to 1,920 parallel workers (the full production queue of the Polaris supercomputer) and demonstrate improved model accuracy as well as faster convergence on the CANDLE benchmark from the Exascale Computing Project.
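The decentralized pattern the abstract describes is simple to picture: every worker alternates between reading whatever results its peers have already published, fitting its own surrogate on that history, and appending its newest evaluation, with no manager in the loop. Below is a minimal, hypothetical sketch of that pattern in Python. The `objective` function, the random-forest surrogate, the UCB-style acquisition, and the use of a `multiprocessing.Manager` list in place of the paper's shared storage are all illustrative assumptions, not the authors' implementation.

```python
# Sketch: asynchronous decentralized BO, one sequential BO loop per worker.
# A Manager list stands in for shared storage (e.g., a database or filesystem).
import random
from multiprocessing import Manager, Process

import numpy as np
from sklearn.ensemble import RandomForestRegressor


def objective(x: float) -> float:
    """Stand-in for an expensive DNN training run (hypothetical)."""
    return -(x - 0.3) ** 2 + 0.05 * random.random()


def worker(store, n_iters: int, seed: int) -> None:
    rng = np.random.default_rng(seed)
    for _ in range(n_iters):
        history = list(store)  # asynchronous read: whatever results exist now
        if len(history) < 5:
            x = float(rng.uniform(0.0, 1.0))  # bootstrap with random search
        else:
            X = np.array([[h[0]] for h in history])
            y = np.array([h[1] for h in history])
            surrogate = RandomForestRegressor(n_estimators=25, random_state=seed)
            surrogate.fit(X, y)
            # Cheap acquisition: sample candidates and pick the one with the
            # best predicted mean plus uncertainty (a crude UCB proxy).
            cands = rng.uniform(0.0, 1.0, size=(256, 1))
            preds = np.stack([t.predict(cands) for t in surrogate.estimators_])
            ucb = preds.mean(axis=0) + 1.96 * preds.std(axis=0)
            x = float(cands[int(np.argmax(ucb))][0])
        store.append((x, objective(x)))  # asynchronous write, no manager


if __name__ == "__main__":
    with Manager() as mgr:
        store = mgr.list()  # shared storage stand-in
        procs = [Process(target=worker, args=(store, 10, s)) for s in range(4)]
        for p in procs:
            p.start()
        for p in procs:
            p.join()
        print("best:", max(store, key=lambda r: r[1]))
```

Because each worker only appends to and reads from the shared store, no worker ever blocks on another, which is what lets utilization stay high as the worker count grows.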

Dates and versions

hal-04219312, version 1 (27-09-2023)

Identifiers

Cite

Romain Egele, Isabelle Guyon, Venkatram Vishwanath, Prasanna Balaprakash. Asynchronous Decentralized Bayesian Optimization for Large Scale Hyperparameter Optimization. eScience 2023 - 19th IEEE International Conference on e-Science, Oct 2023, Limassol, Cyprus. ⟨hal-04219312⟩