Conference paper, 2023

DrBERT: A Robust Pre-trained Model in French for Biomedical and Clinical domains

Abstract

In recent years, pre-trained language models (PLMs) have achieved state-of-the-art performance on a wide range of natural language processing (NLP) tasks. While the first models were trained on general-domain data, specialized ones have emerged to handle specific domains more effectively. In this paper, we propose an original study of PLMs for the medical domain in French. We compare, for the first time, the performance of PLMs trained on both public data from the web and private data from healthcare establishments. We also evaluate different learning strategies on a set of biomedical tasks. In particular, we show that we can take advantage of an already existing biomedical PLM in a foreign language by further pre-training it on our targeted data. Finally, we release the first specialized PLMs for the biomedical field in French, called DrBERT, as well as the largest corpus of medical data under free license on which these models are trained.
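As context for the continued pre-training strategy mentioned in the abstract, the following is a minimal sketch (not the authors' training script) of further pre-training an existing masked language model on a new French medical text corpus with the Hugging Face transformers library. The checkpoint name, corpus file, and hyperparameters are illustrative assumptions, not values from the paper.

```python
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForMaskedLM,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

# Start from an already pre-trained biomedical PLM (hypothetical checkpoint name).
checkpoint = "some-biomedical-bert"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Hypothetical plain-text corpus of French medical documents, one document per line.
dataset = load_dataset("text", data_files={"train": "french_medical_corpus.txt"})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)

tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

# Standard BERT masked language modeling objective: randomly mask 15% of tokens.
collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

args = TrainingArguments(
    output_dir="continued-pretraining",  # illustrative output path
    per_device_train_batch_size=16,
    num_train_epochs=1,
    learning_rate=5e-5,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    data_collator=collator,
)
trainer.train()
```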
Main file
2304.00958v2.pdf (462.14 KB)
DrBERT.pdf (410.04 KB)
SLIDES-DrBERT.pdf (2.05 MB)
Origin: Files produced by the author(s)

Dates and versions

hal-04056658, version 1 (14-07-2023)

Identifiers

  • HAL Id: hal-04056658, version 1

Cite

Yanis Labrak, Adrien Bazoge, Richard Dufour, Mickael Rouvier, Emmanuel Morin, et al. DrBERT: A Robust Pre-trained Model in French for Biomedical and Clinical domains. 61st Annual Meeting of the Association for Computational Linguistics (ACL'23), Jul 2023, Toronto, Canada. ⟨hal-04056658⟩
268 views
93 downloads