Preprint / Working Paper, Year: 2023

Parallel Hyperparameter Optimization Of Spiking Neural Networks

Abstract

Hyperparameter optimization of spiking neural networks (SNNs) is a difficult task that has not yet been deeply investigated in the literature. In this work, we designed a scalable constrained Bayesian optimization algorithm that prevents sampling in the non-spiking areas of an efficient, high-dimensional search space. Such search spaces contain infeasible solutions that output no spikes, or only a few, during the training or testing phases; we call such networks "silent networks". Finding them is difficult, as many hyperparameters are highly correlated with the architecture and the dataset. We leverage silent networks by designing a spike-based early stopping criterion to accelerate the optimization of SNNs trained by Spike Timing Dependent Plasticity (STDP) and surrogate gradient. We parallelized the optimization algorithm asynchronously and ran large-scale experiments on a heterogeneous multi-GPU petascale architecture. Results show that, by considering silent networks, we can design more flexible high-dimensional search spaces while maintaining good efficacy. The optimization algorithm was able to focus on high-performing networks by preventing the costly and fruitless computation of silent networks.
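The abstract describes stopping training early once a network is detected as silent, and feeding that outcome back to the constrained Bayesian optimizer so it avoids non-spiking regions. Below is a minimal sketch of that idea in Python; the `train_epoch` interface, the `min_spike_rate` threshold, and the `patience` parameter are illustrative assumptions, not the authors' actual implementation.

```python
def train_with_spike_early_stopping(network, data, n_epochs,
                                    min_spike_rate=0.01, patience=2):
    """Abort training as soon as the network appears 'silent'.

    `network` is assumed to expose a `train_epoch(data)` method that
    returns the total number of spikes emitted during that epoch
    (a hypothetical interface used only for this sketch).
    """
    silent_epochs = 0
    for epoch in range(n_epochs):
        total_spikes = network.train_epoch(data)
        spike_rate = total_spikes / max(1, len(data))
        if spike_rate < min_spike_rate:
            silent_epochs += 1
            if silent_epochs >= patience:
                # Silent network: stop early and report the trial as
                # infeasible so the optimizer avoids this region later.
                return {"status": "silent", "epochs_run": epoch + 1}
        else:
            silent_epochs = 0
    return {"status": "trained", "epochs_run": n_epochs}
```

The returned status can serve as an infeasibility label for the constrained optimizer, which matches the abstract's description of steering sampling away from non-spiking areas of the search space without spending a full training budget on each silent candidate.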
Main file: snn_silence_arxiv.pdf (1.51 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04464394, version 1 (01-03-2024)


Cite

Thomas Firmin, Pierre Boulet, El-Ghazali Talbi. Parallel Hyperparameter Optimization Of Spiking Neural Networks. 2023. ⟨hal-04464394⟩

