Conference paper, Year: 2022

Why globally re-shuffle? Revisiting data shuffling in large scale deep learning

Abstract

Stochastic gradient descent (SGD) is the most prevalent algorithm for training Deep Neural Networks (DNNs). SGD iterates over the input data set in each training epoch, processing data samples in a random-access fashion. Because this puts enormous pressure on the I/O subsystem, the most common approach to distributed SGD in HPC environments is to replicate the entire dataset to node-local SSDs. However, due to rapidly growing dataset sizes, this approach has become increasingly infeasible. Surprisingly, the questions of why and to what extent random access is required have received little empirical attention in the literature. In this paper, we revisit data shuffling in DL workloads to investigate the viability of partitioning the dataset among workers and performing only a partial distributed exchange of samples in each training epoch. Through extensive experiments on up to 2,048 GPUs of ABCI and up to 4,096 compute nodes of Fugaku, we demonstrate that the validation accuracy of global shuffling can, in practice, be maintained when the partial distributed exchange is carefully tuned. We provide a solution implemented in PyTorch that enables users to control the proposed data exchange scheme.
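To make the idea concrete, here is a minimal single-process sketch of partitioning a dataset among workers and exchanging only a fraction of samples each epoch. This is an illustration under my own assumptions, not the authors' PyTorch implementation: the function name partial_exchange, the random pairwise swap scheme, and the fraction parameter are all hypothetical, and a real multi-node version would move indices between ranks with torch.distributed collectives rather than mutate in-memory lists.

```
import random

def partial_exchange(partitions, fraction, rng):
    # One epoch of "partial distributed exchange" (hypothetical sketch):
    # each worker keeps its local partition of sample indices but trades a
    # random `fraction` of them with a randomly paired peer, so samples
    # migrate across workers without a full global re-shuffle.
    workers = list(range(len(partitions)))
    rng.shuffle(workers)                      # new random pairing each epoch
    for a, b in zip(workers[::2], workers[1::2]):
        k = int(len(partitions[a]) * fraction)
        rng.shuffle(partitions[a])            # local shuffle before the swap
        rng.shuffle(partitions[b])
        # Swap the first k indices between the paired partitions.
        partitions[a][:k], partitions[b][:k] = partitions[b][:k], partitions[a][:k]
    # Each worker then iterates its (locally shuffled) partition this epoch.
    return partitions

# Toy usage: 4 workers, 8 samples each, exchanging 25% of samples per epoch.
rng = random.Random(0)
parts = [list(range(i * 8, (i + 1) * 8)) for i in range(4)]
for epoch in range(3):
    parts = partial_exchange(parts, fraction=0.25, rng=rng)
```

With fraction = 0 this degenerates to pure local shuffling, while fraction = 1 over enough epochs approaches a global re-shuffle; the abstract's claim is that a carefully tuned intermediate setting preserves the validation accuracy of global shuffling at a fraction of the I/O cost.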
Main file: main.pdf (2.32 MB)
Origin: files produced by the author(s)

Dates and versions

hal-03599740, version 1 (07-03-2022)

Identifiers

HAL Id: hal-03599740
DOI: 10.1109/IPDPS53621.2022.00109

Cite

Thao Truong Nguyen, François Trahay, Jens Domke, Aleksandr Drozd, Emil Vatai, et al.. Why globally re-shuffle? Revisiting data shuffling in large scale deep learning. IPDPS 2022: 36th International Parallel & Distributed Processing Symposium, May 2022, Lyon (virtual), France. pp.1-12, ⟨10.1109/IPDPS53621.2022.00109⟩. ⟨hal-03599740⟩
