Conference paper, Year: 2023

Fully Distributed Deep Neural Network : F2D2N

Abstract

Recent advances in Artificial Intelligence (AI) have accelerated the adoption of AI at a pace never seen before. Large Language Models (LLMs) trained on tens of billions of parameters show the crucial importance of parallelizing models. Different techniques exist for distributing Deep Neural Networks, but they are challenging to implement. The cost of training on GPU-based architectures is also becoming prohibitive. In this paper we present a distributed approach that is easier to implement, in which the data and the model are distributed across processing units hosted on a cluster of CPU- or GPU-based machines. Communication is done by message passing. The model is distributed over the cluster and stored locally or on a data lake. We prototyped this approach using open-source libraries, and we present the benefits this implementation can bring.
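To make the abstract's architecture concrete, the sketch below illustrates one way a model can be sharded over processing units that exchange activations by message passing: each MPI rank holds a single layer and forwards its activations to the next rank. This is a purely illustrative assumption; the abstract does not specify the prototype's actual libraries, partitioning scheme, or training loop, and F2D2N itself may rely on a different messaging stack. The layer width, batch size, and mpi4py-based pipeline are all hypothetical.

# Hypothetical sketch: layer-per-rank pipeline over MPI point-to-point messages.
# Run with, e.g.:  mpiexec -n 4 python f2d2n_sketch.py
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

HIDDEN = 128   # assumed layer width
BATCH = 32     # assumed micro-batch size

# Each rank holds only its own layer's parameters (the model is sharded over the cluster).
rng = np.random.default_rng(seed=rank)
W = rng.standard_normal((HIDDEN, HIDDEN)) * 0.01
b = np.zeros(HIDDEN)

if rank == 0:
    # First processing unit: produce (or load) the input activations.
    x = rng.standard_normal((BATCH, HIDDEN))
else:
    # Other units: receive activations from the previous unit by message passing.
    x = comm.recv(source=rank - 1, tag=0)

# Local forward computation on this unit's shard of the model (ReLU layer).
h = np.maximum(x @ W + b, 0.0)

if rank < size - 1:
    # Forward the activations to the next processing unit.
    comm.send(h, dest=rank + 1, tag=0)
else:
    # Last unit holds the network output.
    print(f"rank {rank}: output shape {h.shape}")

Backward passes and parameter storage (local disk or a data lake, as the abstract mentions) would follow the same message-passing pattern, with gradients flowing in the reverse direction.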
Main file

F2D2N.pdf (3.89 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04435168, version 1 (02-02-2024)

Identifiers

HAL Id: hal-04435168
DOI: 10.1007/978-3-031-52426-4_15

Cite

Ernesto Leite, Fabrice Mourlin, Pierre Paradinas. Fully Distributed Deep Neural Network: F2D2N. MSPN 2023, Oct 2023, Paris, France. ⟨10.1007/978-3-031-52426-4_15⟩. ⟨hal-04435168⟩