Preprint, Working Paper. Year: 2023

Parallel Multi-Objective Hyperparameter Optimization with Uniform Normalization and Bounded Objectives

Abstract

Machine learning (ML) methods offer a wide range of configurable hyperparameters that have a significant influence on their performance. While accuracy is a commonly used performance objective, in many settings it is not sufficient. Optimizing ML models with respect to multiple objectives such as accuracy, confidence, fairness, calibration, privacy, latency, and memory consumption is becoming crucial. Hyperparameter optimization, the systematic search for good hyperparameter settings, is already challenging for a single objective and becomes even more challenging for multiple objectives. In addition, differences in objective scales, evaluation failures, and outlier objective values make the problem harder still. We propose a multi-objective Bayesian optimization (MoBO) algorithm that addresses these problems through uniform objective normalization and randomized weights in scalarization. We increase the efficiency of our approach by imposing constraints on the objectives to avoid exploring unnecessary configurations (e.g., insufficient accuracy). Finally, we leverage an approach to parallelize MoBO, which results in a 5x speed-up when using 16x more workers.
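The two core ingredients named in the abstract, uniform (rank-based) objective normalization and randomized-weight scalarization, can be illustrated with a short sketch. The snippet below is a minimal, illustrative Python example and not the authors' implementation; the function names, the Dirichlet choice for the random weights, and the toy objectives are assumptions made for illustration only.

```python
# Illustrative sketch only (not the paper's code): combining rank-based
# "uniform" normalization of objectives with randomized-weight scalarization.
import numpy as np


def uniform_normalize(values: np.ndarray) -> np.ndarray:
    """Map raw objective values to (0, 1] via their empirical ranks.

    Rank-based normalization makes the scalarization insensitive to
    differing objective scales and to outliers, two difficulties the
    abstract identifies.
    """
    ranks = np.argsort(np.argsort(values))
    return (ranks + 1) / len(values)


def random_weight_scalarization(objectives: np.ndarray,
                                rng: np.random.Generator) -> np.ndarray:
    """Collapse an (n_points, n_objectives) matrix into one score per point.

    A fresh random weight vector is drawn for each scalarization so that,
    over many iterations, different trade-offs on the Pareto front are
    explored (the weight distribution here is an assumption).
    """
    n_points, n_objectives = objectives.shape
    normalized = np.column_stack(
        [uniform_normalize(objectives[:, j]) for j in range(n_objectives)]
    )
    weights = rng.dirichlet(np.ones(n_objectives))  # random convex weights
    return normalized @ weights


# Toy usage: two minimized objectives on very different scales,
# e.g. error rate in [0, 0.3] and latency in [1, 500] ms.
rng = np.random.default_rng(0)
objs = np.column_stack([rng.uniform(0.0, 0.3, 50),
                        rng.uniform(1.0, 500.0, 50)])
scores = random_weight_scalarization(objs, rng)
best = int(np.argmin(scores))  # candidate configuration to exploit next
```

In a full MoBO loop this scalarized score would be modeled by the surrogate and re-drawn weights would steer successive acquisitions toward different regions of the Pareto front; configurations violating objective bounds (e.g., insufficient accuracy) would simply be penalized or discarded before scalarization.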

Dates and versions

hal-04219318 , version 1 (27-09-2023)

Identifiers

Cite

Romain Egele, Tyler Chang, Yixuan Sun, Venkatram Vishwanath, Prasanna Balaprakash. Parallel Multi-Objective Hyperparameter Optimization with Uniform Normalization and Bounded Objectives. 2023. ⟨hal-04219318⟩