Conference paper, 2019

Safe Grid Search with Optimal Complexity

Abstract

Popular machine learning estimators involve regularization parameters that can be challenging to tune, and standard strategies rely on grid search for this task. In this paper, we revisit the techniques of approximating the regularization path up to a predefined tolerance $\epsilon$ in a unified framework and show that its complexity is $O(1/\sqrt[d]{\epsilon})$ for uniformly convex losses of order $d > 0$ and $O(1/\sqrt{\epsilon})$ for Generalized Self-Concordant functions. This framework encompasses least squares but also logistic regression, a case that, to the best of our knowledge, was not handled as precisely by previous works. We leverage our technique to provide refined bounds on the validation error as well as a practical algorithm for hyperparameter tuning. The latter has a global convergence guarantee when targeting a prescribed accuracy on the validation set. Last but not least, our approach relieves the practitioner from the (often neglected) task of selecting a stopping criterion when optimizing over the training set: our method automatically calibrates this criterion based on the targeted accuracy on the validation set.
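To make the mechanism concrete, below is a minimal Python sketch (not the authors' implementation) of grid search over a Lasso regularization path in which the solver's stopping criterion is tied to the same target accuracy that drives the grid. The helper name approximate_path, its parameters eps and n_grid, and the fixed geometric grid are illustrative assumptions; the paper's algorithm instead selects grid points adaptively so that the whole path is certified up to $\epsilon$.

import numpy as np
from sklearn.linear_model import Lasso
from sklearn.metrics import mean_squared_error

def approximate_path(X_tr, y_tr, X_va, y_va, eps=1e-2, n_grid=30):
    # Smallest regularization level for which the Lasso solution is zero
    # (for scikit-learn's objective, alpha_max = ||X^T y||_inf / n).
    n = X_tr.shape[0]
    lam_max = np.max(np.abs(X_tr.T @ y_tr)) / n
    # Fixed geometric grid: an assumption; the paper builds the grid adaptively.
    grid = np.geomspace(lam_max, 1e-3 * lam_max, n_grid)
    best_err, best_lam = np.inf, None
    for lam in grid:
        # Each training problem is solved only up to a tolerance calibrated
        # by eps, rather than an arbitrary fixed stopping criterion.
        model = Lasso(alpha=lam, tol=eps).fit(X_tr, y_tr)
        err = mean_squared_error(y_va, model.predict(X_va))
        if err < best_err:
            best_err, best_lam = err, lam
    return best_lam, best_err

For least squares, a uniformly convex loss of order $d = 2$, the stated $O(1/\sqrt{\epsilon})$ bound means that halving the target tolerance grows the certified grid by roughly a factor of $\sqrt{2}$, which is why calibrating the solver tolerance and the grid jointly pays off.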
Main file
1810.05471.pdf (754.25 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01900037, version 1 (20-10-2018)

Identifiers

Cite

Eugene Ndiaye, Tam Le, Olivier Fercoq, Joseph Salmon, Ichiro Takeuchi. Safe Grid Search with Optimal Complexity. International Conference on Machine Learning, 2019, Long Beach, United States. ⟨hal-01900037⟩
