Conference paper · Year: 2019

Embedding strategies for specialized domains: application to clinical entity recognition

Abstract

Using pre-trained word embeddings in conjunction with Deep Learning models has become the de facto approach in Natural Language Processing (NLP). While this usually yields satisfactory results, off-the-shelf word embeddings tend to perform poorly on texts from specialized domains such as clinical reports. Moreover, training specialized word representations from scratch is often either impossible or ineffective due to the lack of large enough in-domain data. In this work, we focus on the clinical domain, for which we study embedding strategies that rely on general-domain resources only. We show that by combining off-the-shelf contextual embeddings (ELMo) with static word2vec embeddings trained on a small in-domain corpus built from the task data, we match, and sometimes outperform, representations learned from a large corpus in the medical domain.
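The following sketch illustrates the general strategy described in the abstract, not the authors' actual code: train word2vec with gensim on the small in-domain corpus built from the task data, obtain contextual vectors for each token from an off-the-shelf ELMo model, and concatenate the two representations before passing them to an entity-recognition model. The clinical sentences and the get_elmo_vectors helper are illustrative assumptions; in practice the contextual vectors would come from a pre-trained ELMo implementation such as the one in AllenNLP.

```python
# Minimal sketch of the embedding-combination strategy (assumptions noted below).
import numpy as np
from gensim.models import Word2Vec

# 1. Static embeddings trained on the (small) in-domain task corpus.
#    The corpus here is a toy example of tokenized clinical sentences.
corpus = [["patient", "denies", "chest", "pain"],
          ["history", "of", "hypertension"]]
w2v = Word2Vec(sentences=corpus, vector_size=100, window=5, min_count=1)

def static_vector(token):
    """Return the word2vec vector, or zeros for out-of-vocabulary tokens."""
    if token in w2v.wv:
        return w2v.wv[token]
    return np.zeros(w2v.vector_size, dtype=np.float32)

# 2. Contextual embeddings from a general-domain ELMo model.
#    Placeholder helper: a real setup would call a pre-trained ELMo model
#    (e.g. allennlp.modules.elmo.Elmo) to get one 1024-d vector per token.
def get_elmo_vectors(sentence):
    return np.random.rand(len(sentence), 1024).astype(np.float32)

# 3. Concatenate both views token by token; the result feeds the NER tagger.
def embed_sentence(sentence):
    contextual = get_elmo_vectors(sentence)                  # (len, 1024)
    static = np.stack([static_vector(t) for t in sentence])  # (len, 100)
    return np.concatenate([contextual, static], axis=1)      # (len, 1124)

features = embed_sentence(["patient", "denies", "chest", "pain"])
print(features.shape)  # (4, 1124)
```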
Main file: P19-2041.pdf (331.71 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-02860947 , version 1 (13-11-2024)

Identifiers

HAL Id: hal-02860947
DOI: 10.18653/v1/P19-2041

Cite

Hicham El Boukkouri, Olivier Ferret, Thomas Lavergne, Pierre Zweigenbaum. Embedding strategies for specialized domains: application to clinical entity recognition. ACL 2019 - 57th Annual Meeting of the Association for Computational Linguistics: Student Research Workshop, Jul 2019, Florence, Italy. pp.295-301, ⟨10.18653/v1/P19-2041⟩. ⟨hal-02860947⟩