Lightweight Adapter Tuning for Multilingual Speech Translation - HAL Open Archive
Conference Paper, 2021

Lightweight Adapter Tuning for Multilingual Speech Translation

Abstract

Adapter modules were recently introduced as an efficient alternative to fine-tuning in NLP. Adapter tuning consists of freezing the pretrained parameters of a model and injecting lightweight modules between layers, adding only a small number of task-specific trainable parameters. While adapter tuning has been investigated for multilingual neural machine translation, this paper proposes a comprehensive analysis of adapters for multilingual speech translation (ST). Starting from different pre-trained models (a multilingual ST model trained on parallel data, or a multilingual BART (mBART) trained on non-parallel multilingual data), we show that adapters can be used to: (a) efficiently specialize ST to specific language pairs with a low extra cost in terms of parameters, and (b) transfer from an automatic speech recognition (ASR) task and an mBART pre-trained model to a multilingual ST task. Experiments show that adapter tuning offers results competitive with full fine-tuning while being much more parameter-efficient.
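To make the idea concrete, the sketch below shows a minimal bottleneck adapter in PyTorch: the pretrained layer's parameters are frozen and only the small inserted module is trained. This is an illustrative sketch, not the authors' implementation; the class names, bottleneck dimension, and insertion point are assumptions.

```python
import torch
import torch.nn as nn

class Adapter(nn.Module):
    """Bottleneck adapter: LayerNorm -> down-projection -> ReLU -> up-projection,
    added back to the input through a residual connection (illustrative sketch)."""
    def __init__(self, d_model: int, bottleneck_dim: int = 64):
        super().__init__()
        self.layer_norm = nn.LayerNorm(d_model)
        self.down_proj = nn.Linear(d_model, bottleneck_dim)
        self.up_proj = nn.Linear(bottleneck_dim, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x + self.up_proj(torch.relu(self.down_proj(self.layer_norm(x))))

class LayerWithAdapter(nn.Module):
    """Wraps a frozen pretrained layer and applies a trainable adapter to its output."""
    def __init__(self, pretrained_layer: nn.Module, d_model: int, bottleneck_dim: int = 64):
        super().__init__()
        self.layer = pretrained_layer
        for p in self.layer.parameters():   # freeze pretrained weights
            p.requires_grad = False
        self.adapter = Adapter(d_model, bottleneck_dim)  # only these weights are trained

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.adapter(self.layer(x))
```

As a rough, purely illustrative calculation (numbers not taken from the paper): with d_model = 512 and a bottleneck of 64, the two projections add about 2 x 512 x 64 ≈ 66K trainable parameters per layer, a small fraction of a full Transformer layer, which is what makes per-language-pair specialization cheap.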
Main file
adapting_multilingual_st-acl2021.pdf (192.49 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03294912, version 1 (21-07-2021)

Identifiers

  • HAL Id: hal-03294912, version 1

Cite

Hang Le, Juan Pino, Changhan Wang, Jiatao Gu, Didier Schwab, et al. Lightweight Adapter Tuning for Multilingual Speech Translation. The Joint Conference of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (ACL-IJCNLP 2021), Aug 2021, Bangkok (Virtual), Thailand. ⟨hal-03294912⟩
78 Views
193 Downloads
