Preprint, Working Paper. Year: 2021

Divide and Rule: Training Context-Aware Multi-Encoder Translation Models with Little Resources

Abstract

Multi-encoder models are a broad family of context-aware Neural Machine Translation (NMT) systems that aim to improve translation quality by encoding document-level contextual information alongside the current sentence. The context encoding is performed by contextual parameters, trained on document-level data. In this work, we show that training these parameters requires a large amount of data, since the contextual training signal is sparse. We propose an efficient alternative, based on splitting sentence pairs, that enriches the training signal of a set of parallel sentences: breaking intra-sentential syntactic links frequently pushes the model to search the context for disambiguating clues. We evaluate our approach with BLEU and contrastive test sets, showing that it allows multi-encoder models to achieve performance comparable to that of a setting where they are trained with 10 times more document-level data. We also show that our approach is a viable option for context-aware NMT on language pairs with zero document-level parallel data.
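To make the splitting idea concrete, below is a minimal, hypothetical sketch (not the authors' implementation) of how a parallel sentence pair could be divided so that the earlier segment becomes context for the later one, turning ordinary sentence-level data into context-dependent training examples. The split heuristic (the first punctuation mark shared by both sides) and the example format are assumptions for illustration only.

```python
# Hypothetical sketch of "divide and rule" data preparation: split each
# parallel sentence pair into two segments, so the earlier segment acts as
# context for the later one. The punctuation-based split point is an
# assumption for illustration, not the paper's exact procedure.

from typing import Dict, List, Tuple

SPLIT_MARKS = (",", ";", ":")  # assumed split points


def split_pair(src: str, tgt: str) -> List[Tuple[str, str]]:
    """Split a sentence pair at the first mark present on both sides;
    return the pair unchanged if no shared mark is found."""
    for mark in SPLIT_MARKS:
        if mark in src and mark in tgt:
            s1, s2 = src.split(mark, 1)
            t1, t2 = tgt.split(mark, 1)
            return [(s1 + mark, t1 + mark), (s2.strip(), t2.strip())]
    return [(src, tgt)]


def make_context_examples(pairs: List[Tuple[str, str]]) -> List[Dict[str, str]]:
    """Turn sentence pairs into (context, source, target) triples, the input
    format assumed here for a multi-encoder model."""
    examples = []
    for src, tgt in pairs:
        context = ""
        for seg_src, seg_tgt in split_pair(src, tgt):
            examples.append(
                {"context": context, "source": seg_src, "target": seg_tgt}
            )
            context = seg_src  # the segment just processed becomes context
    return examples


if __name__ == "__main__":
    pairs = [
        ("She dropped the glass, and it broke.",
         "Elle a fait tomber le verre, et il s'est cassé."),
    ]
    for ex in make_context_examples(pairs):
        print(ex)
```

In the toy example above, the pronoun "it" ("il") in the second segment can only be resolved through its context segment ("the glass" / "le verre"), which illustrates how splitting breaks intra-sentential links and densifies the contextual training signal.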
Main file

2103.17151.pdf (381.26 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03455113, version 1 (29-11-2021)

Identifiers

  • HAL Id: hal-03455113, version 1

Cite

Lorenzo Lupo, Marco Dinarelli, Laurent Besacier. Divide and Rule: Training Context-Aware Multi-Encoder Translation Models with Little Resources. 2021. ⟨hal-03455113⟩
52 Views
36 Downloads
