Conference paper. Year: 2019

Investigating Adaptation and Transfer Learning for End-to-End Spoken Language Understanding from Speech

Abstract

This work investigates speaker adaptation and transfer learning for spoken language understanding (SLU). We focus on the direct extraction of semantic tags from the audio signal using an end-to-end neural network approach. We demonstrate that the learning performance of the target predictive function for the semantic slot filling task can be substantially improved by speaker adaptation and by various knowledge transfer approaches. First, we explore speaker adaptive training (SAT) for end-to-end SLU models and propose to use zero pseudo i-vectors for more efficient model initialization and pretraining in SAT. Second, in order to improve the learning convergence for the target semantic slot filling (SF) task, models trained for different tasks, such as automatic speech recognition and named entity extraction, are used to initialize neural end-to-end models trained for the target task. In addition, we explore the impact of knowledge transfer for SLU from a speech recognition task trained in a different language. These approaches make it possible to develop end-to-end SLU systems in low-resource data scenarios, when there is not enough in-domain semantically labeled data but other resources, such as word transcriptions for the same or another language or named entity annotation, are available.
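The abstract summarizes two training strategies: speaker adaptive training (SAT) initialized with zero pseudo i-vectors, and transfer learning that initializes the target slot-filling model from a model trained on another task (ASR, named entity extraction) or another language. As a rough illustration only, the PyTorch sketch below shows one plausible way these ideas could be wired together; the architecture, dimensions, and all module and function names are assumptions for illustration, not the authors' implementation.

```python
import torch
import torch.nn as nn

IVECTOR_DIM = 100  # hypothetical i-vector dimensionality (an assumption, not from the paper)


class SATEncoder(nn.Module):
    """Acoustic encoder whose input is speech features concatenated with a speaker i-vector."""

    def __init__(self, feat_dim=40, ivec_dim=IVECTOR_DIM, hidden=512, layers=4):
        super().__init__()
        self.rnn = nn.LSTM(feat_dim + ivec_dim, hidden, num_layers=layers,
                           bidirectional=True, batch_first=True)

    def forward(self, feats, ivec):
        # Broadcast the utterance-level i-vector over all frames before concatenation.
        ivec = ivec.unsqueeze(1).expand(-1, feats.size(1), -1)
        out, _ = self.rnn(torch.cat([feats, ivec], dim=-1))
        return out


class End2EndModel(nn.Module):
    """Encoder plus a linear output layer; the output size depends on the task
    (characters for ASR, semantic tags for slot filling, etc.)."""

    def __init__(self, n_outputs, feat_dim=40, ivec_dim=IVECTOR_DIM, hidden=512):
        super().__init__()
        self.encoder = SATEncoder(feat_dim, ivec_dim, hidden)
        self.output = nn.Linear(2 * hidden, n_outputs)

    def forward(self, feats, ivec):
        return self.output(self.encoder(feats, ivec))


def zero_pseudo_ivectors(batch_size, dim=IVECTOR_DIM):
    """Stage 1: pretrain with zero pseudo i-vectors, i.e. in a speaker-independent
    fashion, before switching to real i-vectors for speaker adaptive training."""
    return torch.zeros(batch_size, dim)


def transfer_from_source(source: End2EndModel, n_target_outputs: int) -> End2EndModel:
    """Stage 2: initialize a slot-filling model from a model trained on a source task
    (e.g. ASR or named entity extraction), reusing the encoder weights and
    re-initializing only the task-specific output layer."""
    target = End2EndModel(n_target_outputs)
    target.encoder.load_state_dict(source.encoder.state_dict())
    return target
```

In such a setup, training would first run with zero_pseudo_ivectors on the source task, then continue with real speaker i-vectors, and finally the encoder would be copied into the slot-filling model via transfer_from_source.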

Dates and versions

hal-02307811, version 1 (08-10-2019)

Identifiers

Cite

Natalia Tomashenko, Antoine Caubrière, Yannick Estève. Investigating Adaptation and Transfer Learning for End-to-End Spoken Language Understanding from Speech. Interspeech 2019, Sep 2019, Graz, Austria. pp.824-828, ⟨10.21437/Interspeech.2019-2158⟩. ⟨hal-02307811⟩
