Conference paper, 2022

Online Hyperparameter Optimization for Streaming Neural Networks

Abstract

Neural networks have enjoyed tremendous success in many areas over the last decade. They are also receiving increasing attention in learning from data streams, a setting that is inherently incremental. This incremental setting poses challenges for hyperparameter optimization, which is essential to obtain satisfactory network performance. To address this challenge, we introduce Continuously Adaptive Neural networks for Data streams (CAND). For every prediction, CAND chooses the current best network from a pool of candidates by continuously monitoring the performance of all candidate networks. The candidates are trained using different optimizers and hyperparameters. An experimental comparison against three state-of-the-art stream learning methods over 17 benchmark streaming datasets confirms the competitive performance of CAND, especially on high-dimensional data. We also investigate two orthogonal heuristics for accelerating CAND, which trade off small amounts of accuracy for significant run-time gains. We observe that training on small mini-batches yields accuracy similar to single-instance, fully incremental training, even on evolving data streams.
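The abstract describes the core mechanism of CAND: maintain a pool of networks trained incrementally with different optimizers and hyperparameters, and route each prediction to whichever candidate currently performs best. The following is a minimal Python sketch of that idea; the class names, the predict_one/learn_one model interface, and the sliding-window accuracy estimate are illustrative assumptions, not the authors' implementation.

```python
from collections import deque

class Candidate:
    """One member of the pool: an incremental model plus a running
    estimate of its recent predictive accuracy."""

    def __init__(self, model, window=1000):
        self.model = model                  # incremental classifier (assumed
                                            # predict_one/learn_one interface)
        self.recent = deque(maxlen=window)  # 1/0 outcomes of recent predictions

    def accuracy(self):
        return sum(self.recent) / len(self.recent) if self.recent else 0.0


class CandidatePool:
    """Routes each prediction to the currently best-performing candidate
    and trains every candidate on every labelled instance."""

    def __init__(self, candidates):
        self.candidates = candidates

    def predict(self, x):
        # Pick the candidate with the highest recent accuracy right now.
        best = max(self.candidates, key=lambda c: c.accuracy())
        return best.model.predict_one(x)

    def learn(self, x, y):
        # Test-then-train (prequential): score each candidate on (x, y)
        # first, then let it learn from the instance.
        for c in self.candidates:
            c.recent.append(1 if c.model.predict_one(x) == y else 0)
            c.model.learn_one(x, y)
```

The test-then-train loop in learn mirrors standard prequential stream evaluation: each candidate is scored on an instance before learning from it, so the running accuracy estimate is not biased by having already seen the label.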
No file deposited

Dates and versions

hal-04468402, version 1 (20-02-2024)

Identifiers

HAL Id: hal-04468402
DOI: 10.1109/IJCNN55064.2022.9891953

Cite

Nuwan Gunasekara, Heitor Murilo Gomes, Bernhard Pfahringer, Albert Bifet. Online Hyperparameter Optimization for Streaming Neural Networks. International Joint Conference on Neural Networks (IJCNN 2022), Jul 18-23, 2022, Padua, Italy. pp. 1-9, ⟨10.1109/IJCNN55064.2022.9891953⟩. ⟨hal-04468402⟩