Conference paper, 2023

Pre-training for Speech Translation: CTC Meets Optimal Transport

Abstract

The gap between speech and text modalities is a major challenge in speech-to-text translation (ST). Different methods have been proposed to reduce this gap, but most of them require architectural changes in ST training. In this work, we propose to mitigate this issue at the pre-training stage, requiring no change in the ST model. First, we show that the connectionist temporal classification (CTC) loss can reduce the modality gap by design. We provide a quantitative comparison with the more common cross-entropy loss, showing that pre-training with CTC consistently achieves better final ST accuracy. Nevertheless, CTC is only a partial solution and thus, in our second contribution, we propose a novel pre-training method combining CTC and optimal transport to further reduce this gap. Our method pre-trains a Siamese-like model composed of two encoders, one for acoustic inputs and the other for textual inputs, such that they produce representations that are close to each other in the Wasserstein space. Extensive experiments on the standard CoVoST-2 and MuST-C datasets show that our pre-training method applied to the vanilla encoder-decoder Transformer achieves state-of-the-art performance under the no-external-data setting, and performs on par with recent strong multi-task learning systems trained with external data. Finally, our method can also be applied on top of these multi-task systems, leading to further improvements for these models.
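The abstract gives no implementation details, but the shape of the combined objective can be sketched. Below is a minimal illustration, assuming PyTorch, toy stand-in encoder outputs, and an entropy-regularized (Sinkhorn) approximation of the Wasserstein distance; the hyperparameters (epsilon, number of iterations, the 0.1 loss weight) and all tensor shapes are illustrative assumptions, not values from the paper.

```python
import torch
import torch.nn as nn

def sinkhorn_wasserstein(x, y, epsilon=0.1, n_iters=50):
    """Entropy-regularized Wasserstein distance between embedding
    sequences x (m, d) and y (n, d), via Sinkhorn iterations."""
    m, n = x.size(0), y.size(0)
    cost = torch.cdist(x, y, p=2) ** 2          # pairwise squared-Euclidean costs (m, n)
    cost = cost / cost.max().detach()           # rescale so the Gibbs kernel stays well-conditioned
    a = torch.full((m,), 1.0 / m)               # uniform marginal over speech positions
    b = torch.full((n,), 1.0 / n)               # uniform marginal over text positions
    K = torch.exp(-cost / epsilon)              # Gibbs kernel
    u = torch.ones_like(a)
    for _ in range(n_iters):                    # alternate scaling to match both marginals
        v = b / (K.t() @ u + 1e-8)
        u = a / (K @ v + 1e-8)
    plan = u.unsqueeze(1) * K * v.unsqueeze(0)  # approximate optimal transport plan
    return (plan * cost).sum()

# Toy stand-ins for the outputs of the two Siamese encoders (batch size 1).
d_model, vocab_size = 256, 1000
speech_states = torch.randn(120, d_model, requires_grad=True)  # acoustic encoder states (T, d)
text_states = torch.randn(20, d_model, requires_grad=True)     # textual encoder states (L, d)

# CTC branch: project acoustic states to vocabulary logits over transcript tokens.
ctc_proj = nn.Linear(d_model, vocab_size + 1)                  # +1 for the CTC blank symbol
log_probs = ctc_proj(speech_states).log_softmax(-1).unsqueeze(1)  # (T, N=1, C)
targets = torch.randint(1, vocab_size, (1, 20))                # dummy transcript token ids
ctc_loss = nn.CTCLoss(blank=0)(
    log_probs, targets,
    torch.tensor([120]), torch.tensor([20]),   # input and target lengths
)

# OT branch: pull the two encoders' representations together in Wasserstein space.
ot_loss = sinkhorn_wasserstein(speech_states, text_states)

total_loss = ctc_loss + 0.1 * ot_loss          # the weighting is an assumption
total_loss.backward()
```

Everything here is a toy used to show how the two terms combine: CTC supervises the acoustic encoder against the transcript, while the OT term drives the acoustic and textual encoder outputs toward each other in Wasserstein space, so that the pre-trained speech encoder can be dropped into a standard encoder-decoder ST model unchanged.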
Main file
2023_ICML_pretraining_ctc_ot.pdf (444.26 KB)
Origin: Publisher files allowed on an open archive
License: CC BY - Attribution

Dates and versions

hal-04117237, version 1 (05-06-2023)

License

Attribution (CC BY)

Identifiers

  • HAL Id: hal-04117237, version 1

Cite

Phuong-Hang Le, Hongyu Gong, Changhan Wang, Juan Pino, Benjamin Lecouteux, et al. Pre-training for Speech Translation: CTC Meets Optimal Transport. International Conference on Machine Learning (ICML 2023), Jul 2023, Honolulu, Hawaii, United States. ⟨hal-04117237⟩
