Conference paper, Year: 2022

Word Order Matters When You Increase Masking

Abstract

Word order, an essential property of natural languages, is injected into Transformer-based neural language models using position encoding. However, recent experiments have shown that explicit position encoding is not always useful, since some models without it have achieved state-of-the-art performance on some tasks. To better understand this phenomenon, we examine the effect of removing position encodings on the pre-training objective itself (i.e., masked language modelling), to test whether models can reconstruct position information from co-occurrences alone. We do so by controlling the amount of masked tokens in the input sentence, as a proxy to affect the importance of position information for the task. We find that the necessity of position information increases with the amount of masking, and that masked language models without position encodings are not able to reconstruct this information on the task. These findings point towards a direct relationship between the amount of masking and the ability of Transformers to capture order-sensitive aspects of language using position encoding.
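The core manipulation described in the abstract, varying the fraction of masked tokens in an input, can be sketched in a few lines of Python. This is an illustrative sketch, not the authors' implementation: the mask_tokens function, the sweep of masking rates, and the use of BERT's [MASK] token id (103 in bert-base-uncased) are assumptions made for the example.

    import torch

    def mask_tokens(input_ids, mask_token_id, masking_rate):
        # Hypothetical helper: mask a controllable fraction of tokens for MLM.
        masked = input_ids.clone()
        labels = input_ids.clone()
        # Sample which positions to mask at the requested rate.
        mask = torch.rand(input_ids.shape) < masking_rate
        masked[mask] = mask_token_id
        # -100 is ignored by torch.nn.CrossEntropyLoss by default, so the
        # loss is computed only at masked positions.
        labels[~mask] = -100
        return masked, labels

    # Toy batch of token ids; 103 is [MASK] in bert-base-uncased.
    batch_ids = torch.randint(1000, 30000, (8, 128))
    for rate in (0.15, 0.40, 0.75):
        inputs, labels = mask_tokens(batch_ids, mask_token_id=103, masking_rate=rate)

Sweeping masking_rate this way changes how much the reconstruction task can rely on the surrounding co-occurring tokens rather than on position, which is the proxy the abstract describes for probing the importance of position information.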
Main file

2022.emnlp-main.118.pdf (326.25 KB)
Origin: Publisher files authorized on an open archive

Dates and versions

hal-04026322, version 1 (13-03-2023)

Identifiers

  • HAL Id: hal-04026322, version 1

Cite

Karim Lasri, Thierry Poibeau, Alessandro Lenci. Word Order Matters When You Increase Masking. Conference on Empirical Methods in Natural Language Processing, ACL, Dec 2022, Abu Dhabi, United Arab Emirates. pp. 1808-1815. ⟨hal-04026322⟩