Explainable Hyperparameters Optimization using Hilbert-Schmidt Independence Criterion
Abstract
Tackling new machine learning problems with neural networks always means optimizing the numerous hyperparameters that define their structure and strongly impact their performance. In this work, we build explainable hyperparameters optimization on an approach borrowed from robust system design: searching for a parametrization of a system (a neural network) that optimizes its output response (the prediction error) to an input stimulation (the test set). To that end, we study the use of the Hilbert-Schmidt Independence Criterion (HSIC), a dependence measure between probability distributions that is widely used for sensitivity analysis in robust system design, in the context of Hyperparameters Optimization. Hyperparameter spaces can be complex and unwieldy, mixing hyperparameters of different natures (categorical, discrete, boolean, continuous) with interactions and interdependencies, which makes applying HSIC non-trivial. We alleviate these difficulties so that HSIC becomes applicable in this context, and obtain an analysis tool that quantifies the relative impact of each hyperparameter on a neural network's final error. Notably, we show how this knowledge allows building competitive neural networks that are naturally much more cost-effective.
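To give a concrete feel for the dependence measure the abstract refers to, below is a minimal NumPy sketch of the standard biased HSIC estimator (Gretton et al., 2005) with Gaussian kernels, applied to toy samples of continuous hyperparameters and errors. This is an illustration under our own assumptions, not the paper's implementation; the function names, the toy data, and the median-heuristic bandwidth are all hypothetical choices.

```python
import numpy as np

def gaussian_kernel(x, sigma=None):
    """Pairwise Gaussian (RBF) kernel matrix for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    if sigma is None:
        # Median heuristic for the bandwidth (a common, hypothetical choice).
        sigma = np.sqrt(np.median(d2[d2 > 0]) / 2)
    return np.exp(-d2 / (2 * sigma ** 2))

def hsic(x, y):
    """Biased V-statistic estimator of HSIC between two 1-D samples."""
    n = len(x)
    K, L = gaussian_kernel(x), gaussian_kernel(y)
    H = np.eye(n) - np.ones((n, n)) / n  # centering matrix
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Hypothetical usage: rank hyperparameters by their dependence with the error.
rng = np.random.default_rng(0)
lr = rng.uniform(1e-4, 1e-1, 200)                 # sampled learning rates
width = rng.integers(8, 256, 200).astype(float)   # sampled layer widths
error = 0.5 * np.log10(lr) ** 2 + 0.01 * rng.normal(size=200)  # toy errors
print({"learning_rate": hsic(lr, error), "width": hsic(width, error)})
# learning_rate should get the larger HSIC score, since it drives the toy error.
```

Note that this sketch only covers continuous inputs; handling the categorical, boolean, and discrete hyperparameters the abstract mentions requires adapted kernels, which is precisely part of the difficulty the paper addresses.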