Conference paper, Year: 2021

Narcissist! Do you need so much attention?

Abstract

After the rise of Word2vec came the BERT era, with large architectures that handle polysemy by taking contextual information into account, leading to great performance improvements on classic NLP tasks. BERT systems are considered universal: they can be fine-tuned to address any task efficiently. However, these systems are huge to deploy, not trivial to fine-tune, and may not be suited to some corpora, e.g. domain-specific and small ones. For instance, we consider the DEFT 2018 corpus of tweets and show that CamemBERT is not appropriate for this corpus and task. Following Occam's razor principle, we thus designed MiniBERT, a tiny BERT architecture that includes a simplified self-attention mechanism and requires neither pre-training nor external data. We show that this easily trainable and deployable system obtains encouraging results on DEFT, whilst providing interpretable results.
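As a rough illustration of the mechanism the abstract refers to, the sketch below implements standard single-head scaled dot-product self-attention in NumPy. It is not the simplified MiniBERT variant described in the paper (whose details are not given here); the function names, dimensions, and toy usage are assumptions made for illustration only.

```python
# Minimal single-head scaled dot-product self-attention, for illustration only.
# NOT the MiniBERT simplification from the paper; names and dimensions are assumed.
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)  # subtract max for numerical stability
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """X: (seq_len, d_model) token embeddings; Wq/Wk/Wv: (d_model, d_head) projections."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[-1])   # (seq_len, seq_len) attention logits
    weights = softmax(scores, axis=-1)        # each token attends over all tokens
    return weights @ V                        # contextualised token representations

# Toy usage: 4 tokens, 8-dimensional embeddings, one 8-dimensional head.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))
print(self_attention(X, Wq, Wk, Wv).shape)  # (4, 8)
```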
Main file
CAp2021(1).pdf (343.71 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03236703, version 1 (14-06-2021)

Identifiers

  • HAL Id: hal-03236703, version 1

Cite

Gaëtan Caillaut, Nicolas Dugué, Nathalie Camelin. Narcissist! Do you need so much attention?. CAp: Conférence sur l'Apprentissage automatique, Jun 2021, Saint-Étienne (online), France. ⟨hal-03236703v1⟩

Collections

UNIV-LEMANS LIUM
