Conference paper, Year: 2023

Language Portability Strategies for Open-domain Dialogue with Pre-trained Language Models from High to Low Resource Languages

Ahmed Njifenjou
Virgile Sucal
Bassam Jabaian
Fabrice Lefèvre

Abstract

In this paper we propose a study of linguistic portability strategies for large pre-trained language models (PLMs) used for open-domain dialogue systems, a task for which such models exist mainly in a high-resource language. In particular, the target low-resource language (L_T) is simulated with French, as it lacks task-specific resources and allows for our human evaluation, while the source language (L_S) is English. For obvious reasons, recent works using such models for open-domain dialogue are mostly developed in English. Yet building specific PLMs for each possible target language would require collecting new datasets and would be costly. For this reason, we wish to leverage all existing resources (PLMs and data) in both L_S and L_T and assess the performance achievable in L_T with different approaches. The first two approaches evaluate the use of Neural Machine Translation (NMT) at different levels: TrainOnTarget, where an L_S dataset is translated before fine-tuning in L_T, and TestOnSource, where an L_S model is coupled with NMT modules during inference. Then, the advent of BLOOM [2], the world's first open-access multilingual large PLM, allows researchers to develop new approaches leveraging not only the model's full accessibility but also its multilingualism and translation abilities. In this context the task is learned in L_S first and adapted to L_T using the MAD-X Adapter architecture [16]. In both sets of experiments the models are evaluated in spoken dialogue conditions with humans, and the strategies can be compared in terms of perceived interaction quality.
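For illustration, the following is a minimal sketch of how the TestOnSource strategy can be wired at inference time: the dialogue model stays in the source language (English) and NMT modules translate the user's French turns in and the system's responses out. The Hugging Face pipeline API and the model names used here (Helsinki-NLP/opus-mt-fr-en, Helsinki-NLP/opus-mt-en-fr, facebook/blenderbot-400M-distill) are illustrative assumptions, not necessarily the components evaluated in the paper.

```python
# Minimal sketch of the TestOnSource strategy: the open-domain dialogue model
# remains in the source language (English, L_S) and NMT modules bridge to and
# from the target language (French, L_T) at inference time.
from transformers import pipeline

# Illustrative model choices (assumptions, not necessarily the paper's components).
fr_to_en = pipeline("translation", model="Helsinki-NLP/opus-mt-fr-en")
en_to_fr = pipeline("translation", model="Helsinki-NLP/opus-mt-en-fr")
chatbot = pipeline("text2text-generation", model="facebook/blenderbot-400M-distill")

def respond(user_turn_fr: str) -> str:
    # 1. Translate the user's French turn into English for the L_S dialogue model.
    turn_en = fr_to_en(user_turn_fr)[0]["translation_text"]
    # 2. Generate the system response with the English-only PLM.
    reply_en = chatbot(turn_en)[0]["generated_text"]
    # 3. Translate the response back into French before returning it to the user.
    return en_to_fr(reply_en)[0]["translation_text"]

print(respond("Salut ! Qu'as-tu fait ce week-end ?"))
```

The TrainOnTarget strategy differs only in where translation happens: the L_S training dataset is machine-translated into L_T once, and the model is then fine-tuned directly on the translated data, so no NMT module is needed at inference.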
Main file: IWSDS_2023_hal.pdf (660.6 KB). Origin: files produced by the author(s).

Dates and versions

hal-04631021 , version 1 (01-07-2024)

Cite

Ahmed Njifenjou, Virgile Sucal, Bassam Jabaian, Fabrice Lefèvre. Language Portability Strategies for Open-domain Dialogue with Pre-trained Language Models from High to Low Resource Languages. The 13th International Workshop on Spoken Dialogue Systems Technology (IWSDS '23), Feb 2023, Los Angeles, United States. ⟨hal-04631021⟩