Conference paper, Year: 2022

Improving Neural Architecture Search by Mixing a FireFly algorithm with a Training Free Evaluation

Abstract

Neural Architecture Search (NAS) algorithms are used to automate the design of deep neural networks. Finding the best architecture for a given dataset can be time-consuming, since these algorithms must explore a large number of networks and score them according to their performance in order to choose the most appropriate one. In this work, we propose a novel metric that uses the Intra-Cluster Distance (ICD) score to evaluate the ability of an untrained model to distinguish between data, in order to approximate its quality. We also use an improved version of the FireFly algorithm, more robust to the local optima problem than the baseline FireFly algorithm, as a search technique to find the neural network model best adapted to a specific dataset. Experimental results on several NAS benchmarks show that our metric is valid for scoring both CNNs and RNNs, and that our proposed FireFly algorithm improves on the results obtained by state-of-the-art training-free methods.
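The abstract only outlines the two ingredients; the paper itself defines them precisely. As a rough illustration of the first idea, here is a minimal sketch of an intra-cluster-distance-style score computed on the embeddings of an untrained network: every name in it (icd_score, the use of class labels as clusters, the random linear embedding standing in for a candidate architecture) is an assumption for illustration, not the authors' formulation.

```python
import numpy as np

def icd_score(features, labels):
    """Hypothetical Intra-Cluster Distance score: the mean distance of each
    sample's embedding to its class centroid, averaged over classes. Lower
    values suggest the untrained network already groups same-class inputs
    together. (Illustrative assumption, not the paper's exact metric.)"""
    classes = np.unique(labels)
    score = 0.0
    for c in classes:
        cluster = features[labels == c]           # embeddings of one class
        centroid = cluster.mean(axis=0)           # class centroid
        score += np.linalg.norm(cluster - centroid, axis=1).mean()
    return score / len(classes)

# Toy usage: a random (untrained) linear map stands in for the forward
# pass of a candidate architecture, so no training is required to score it.
rng = np.random.default_rng(0)
X = rng.normal(size=(128, 32))                    # a mini-batch of inputs
y = rng.integers(0, 10, size=128)                 # class labels
W = rng.normal(size=(32, 16))                     # untrained weights
print(icd_score(X @ W, y))                        # training-free proxy score
```

For the second ingredient, the sketch below shows one iteration of the baseline firefly algorithm over continuous encodings of architectures, where brightness would be the training-free score above. The paper's improved variant adds mechanisms against local optima that are not reproduced here; firefly_step and its parameters are illustrative names.

```python
def firefly_step(pos, brightness, beta0=1.0, gamma=1.0, alpha=0.1, rng=None):
    """One baseline firefly update: each firefly i moves toward every
    brighter firefly j with attractiveness beta0 * exp(-gamma * r^2),
    plus a small random perturbation scaled by alpha."""
    if rng is None:
        rng = np.random.default_rng()
    new = pos.copy()
    n, d = pos.shape
    for i in range(n):
        for j in range(n):
            if brightness[j] > brightness[i]:     # move i toward brighter j
                r2 = np.sum((pos[i] - pos[j]) ** 2)
                beta = beta0 * np.exp(-gamma * r2)
                new[i] += beta * (pos[j] - pos[i]) + alpha * rng.normal(size=d)
    return new
```

In a NAS loop under these assumptions, one would decode each row of pos into an architecture, score it with the training-free metric, and iterate firefly_step until the population converges.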
IJCNNFinalNonEditeur.pdf (1.35 MB)

Dates and versions

hal-04043366 , version 1 (27-03-2023)

Identifiers

Cite

Nassim Mokhtari, Alexis Nédélec, Marlene Gilles, Pierre de Loor. Improving Neural Architecture Search by Mixing a FireFly algorithm with a Training Free Evaluation. 2022 International Joint Conference on Neural Networks (IJCNN), Jul 2022, Padua, Italy. pp.1-8, ⟨10.1109/IJCNN55064.2022.9892861⟩. ⟨hal-04043366⟩
