Conference paper, 2024

DataStates-LLM: Lazy Asynchronous Checkpointing for Large Language Models

Avinash Maurya
Robert Underwood
M Mustafa Rafique
Franck Cappello
Bogdan Nicolae

Abstract

Large language models (LLMs) have seen rapid adoption across all domains. They need to be trained on high-end high-performance computing (HPC) infrastructures and ingest massive amounts of input data. Unsurprisingly, at such a large scale, unexpected events (e.g., component failures, software instability, undesirable learning patterns) are frequent and typically impact training negatively. Thus, LLMs need to be checkpointed frequently so that they can be rolled back to a stable state and subsequently fine-tuned. However, given the large sizes of LLMs, a straightforward checkpointing solution that directly writes the model parameters and optimizer state to persistent storage (e.g., a parallel file system) incurs significant I/O overheads. To address this challenge, in this paper we study how to reduce these I/O overheads so that fast and scalable checkpointing can be applied at high frequency (up to the granularity of individual iterations) without significant impact on the training process. Specifically, we introduce a lazy asynchronous multi-level approach that takes advantage of the fact that the tensors making up the model and optimizer state shards remain immutable for extended periods of time, which makes it possible to copy their content in the background with minimal interference to the training process. We evaluate our approach at scales of up to 180 GPUs using different model sizes, parallelism settings, and checkpointing frequencies. The results show up to 4x faster checkpointing and 2.2x faster end-to-end training runtime compared with state-of-the-art checkpointing approaches.
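To make the mechanism concrete, below is a minimal PyTorch sketch of the lazy asynchronous idea described in the abstract: immutable tensors are first copied device-to-host into pinned buffers, and a background thread then flushes the snapshot to persistent storage while training continues. The function name, the two-stage split, and all identifiers are illustrative assumptions, not the DataStates-LLM implementation.

```python
import threading

import torch


def lazy_async_checkpoint(state_dict, path):
    # Hypothetical sketch, not the DataStates-LLM API.
    # Stage 1: copy each tensor from device to a pinned (page-locked)
    # host buffer. Because the tensors stay immutable between optimizer
    # steps, the non-blocking D2H copies can overlap with GPU compute.
    snapshot = {}
    for name, tensor in state_dict.items():
        host_buf = torch.empty(tensor.shape, dtype=tensor.dtype, pin_memory=True)
        host_buf.copy_(tensor, non_blocking=True)
        snapshot[name] = host_buf
    torch.cuda.synchronize()  # ensure all copies have landed in host memory

    # Stage 2: flush the host snapshot to persistent storage (e.g., a
    # parallel file system) from a background thread so the next
    # training iteration can start immediately.
    flusher = threading.Thread(target=torch.save, args=(snapshot, path), daemon=True)
    flusher.start()
    return flusher  # caller may join() before taking the next checkpoint
```

Pinned host buffers matter here because non-blocking device-to-host copies require page-locked memory; a real multi-level scheme would additionally avoid the global torch.cuda.synchronize() by synchronizing on a dedicated copy stream and would throttle the background flush to limit interference.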
Main file
Async_LLM_Checkpointing__Avinash_HPDC_24_.pdf (1.52 MB)
Origin: files produced by the author(s)

Dates and versions

hal-04614247, version 1 (17-06-2024)


Cite

Avinash Maurya, Robert Underwood, M Mustafa Rafique, Franck Cappello, Bogdan Nicolae. DataStates-LLM: Lazy Asynchronous Checkpointing for Large Language Models. HPDC'24: 33rd International Symposium on High-Performance Parallel and Distributed Computing, Jun 2024, Pisa, Italy. ⟨10.1145/3625549.3658685⟩. ⟨hal-04614247⟩