The Impact of Hyper-Parameter Tuning for Landscape-Aware Performance Regression and Algorithm Selection
Abstract
Automated algorithm selection and configuration methods that build on exploratory landscape analysis (ELA) are becoming very popular in Evolutionary Computation. However, despite a significantly growing number of applications, the underlying machine learning models are often chosen in an ad-hoc manner.
We show in this work that three classical regression methods are able to achieve meaningful results for ELA-based algorithm selection. For those three models (random forests, decision trees, and bagging decision trees), the quality of the regression models is highly impacted by the chosen hyper-parameters. This, in turn, significantly affects the quality of the algorithm selectors that are built on top of these regressions.
By comparing a total of 30 different models, each coupled with two complementary regression strategies, we derive guidelines for the tuning of the regression models and provide general recommendations for a more systematic use of classical machine learning models in landscape-aware algorithm selection. We point out that the choice of the machine learning model deserves careful attention and further investigation.
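To illustrate the kind of tuning the abstract refers to, here is a minimal sketch (not the authors' pipeline) of hyper-parameter search for a random forest regressor that predicts an optimizer's performance from ELA features. The feature and target arrays as well as the parameter ranges are placeholders for illustration only.

```python
# Minimal sketch, assuming scikit-learn; data and parameter ranges are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import RandomizedSearchCV

rng = np.random.default_rng(0)
X = rng.random((200, 16))   # stand-in for ELA feature vectors (one row per problem instance)
y = rng.random(200)         # stand-in for the logged performance of one algorithm

# Illustrative hyper-parameter ranges for the regression model
param_distributions = {
    "n_estimators": [100, 300, 500],
    "max_depth": [None, 5, 10, 20],
    "min_samples_split": [2, 5, 10],
    "max_features": ["sqrt", "log2", 1.0],
}

search = RandomizedSearchCV(
    RandomForestRegressor(random_state=0),
    param_distributions,
    n_iter=20,
    cv=5,
    scoring="neg_mean_squared_error",
    random_state=0,
)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

A tuned regressor of this kind would then be queried per problem instance, and an algorithm selector could simply pick the algorithm with the best predicted performance.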
Domains
Neural and Evolutionary Computing [cs.NE]
Main file
Jankovic.Popovski.Eftimov.Doerr GECCO 2104.09272.pdf (2.62 MB)
Origin: Files produced by the author(s)