Neural Architecture Tuning: A BO-Powered NAS Tool - Archive ouverte HAL
Conference paper, Year: 2024

Neural Architecture Tuning: A BO-Powered NAS Tool

Abstract

Neural Architecture Search (NAS) applies an optimization technique to find the best-performing architecture(s) in a defined search space with respect to an objective function. Practical NAS implementations currently face several limitations: prohibitive costs due to the large number of evaluations required, inflexibility in defining the search space, which often forces users to select from a limited set of possible design components, and difficulty integrating existing architecture code, since a specialized design language is typically required to specify the search space. We propose a simplified search tool that is efficient in the number of evaluations needed to achieve good results and flexible by design, allowing an easy and open definition of the search space and the objective function. Interoperability with existing code or newly released architectures from the literature lets users quickly and easily tune architectures to produce well-performing solutions tailored to particular use cases. We apply this tool to certain vision search spaces and showcase its effectiveness.
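For illustration, the following is a minimal sketch of what a BO-driven NAS loop over a user-defined search space and objective could look like. It is not the tool's actual API: it uses scikit-optimize's gp_minimize as a generic Bayesian optimization backend, and the objective body is a hypothetical placeholder standing in for training and validating a candidate architecture.

```python
# Minimal sketch of a BO-driven NAS loop (not the paper's tool; scikit-optimize's
# Gaussian-process BO is used here as a generic optimization backend).
from skopt import gp_minimize
from skopt.space import Integer, Categorical
from skopt.utils import use_named_args

# User-defined search space: architecture design choices to tune.
search_space = [
    Integer(2, 8, name="num_blocks"),
    Integer(32, 256, name="width"),
    Categorical(["relu", "gelu", "swish"], name="activation"),
]

@use_named_args(search_space)
def objective(num_blocks, width, activation):
    """Hypothetical objective: build the candidate architecture, train it
    briefly, and return a score to minimize (e.g. validation error)."""
    # Placeholder for: model = build_model(num_blocks, width, activation)
    #                  val_error = train_and_evaluate(model)
    val_error = 1.0 / (num_blocks * width)  # stand-in value for illustration only
    return val_error

# Bayesian optimization keeps the number of (expensive) evaluations small.
result = gp_minimize(objective, search_space, n_calls=30, random_state=0)
print("Best configuration:", result.x, "score:", result.fun)
```

In a real run, the placeholder objective would be replaced by the user's own architecture-building and evaluation code, which is the interoperability point the abstract emphasizes.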
No file deposited

Dates and versions

hal-04540630, version 1 (10-04-2024)

Identifiers

  • HAL Id: hal-04540630, version 1

Cite

Houssem Ouertatani, Cristian Maxim, Smail Niar, El-Ghazali Talbi. Neural Architecture Tuning: A BO-Powered NAS Tool. International Conference in Optimization and Learning (OLA), May 2024, Dubrovnik, Croatia. ⟨hal-04540630⟩