STABILISING AND ACCELERATING LIGHT GATED RECURRENT UNITS FOR AUTOMATIC SPEECH RECOGNITION

Preprint, Working Paper, Year: 2023

Abstract

The light gated recurrent unit (Li-GRU) is well known for achieving impressive results on automatic speech recognition (ASR) tasks while being lighter and faster to train than a standard gated recurrent unit (GRU). However, the unbounded rectified linear unit on its candidate recurrent state induces a severe exploding-gradient phenomenon that disrupts training and prevents the model from being applied to widely used datasets. In this paper, we theoretically and empirically derive the necessary conditions for its stability, as well as engineering mechanisms that speed up its training time by a factor of five, thereby introducing a novel version of this architecture named SLi-GRU. We then evaluate its performance both on a toy task illustrating its newly acquired capabilities and on a set of three different ASR datasets, demonstrating lower word error rates compared to more complex recurrent neural networks.
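For context, the recurrence in question follows the original Li-GRU formulation (Ravanelli et al.): a single update gate z_t = σ(BN(W_z x_t) + U_z h_{t-1}) interpolates between the previous state and a ReLU candidate h̃_t = ReLU(BN(W_h x_t) + U_h h_{t-1}), giving h_t = z_t ⊙ h_{t-1} + (1 − z_t) ⊙ h̃_t. The snippet below is a minimal illustrative PyTorch sketch of that standard cell, not the stabilised SLi-GRU proposed in the paper; the class name and layer choices are our own assumptions.

```python
import torch
import torch.nn as nn

class LiGRUCell(nn.Module):
    """Illustrative single-step Li-GRU cell (standard formulation, not SLi-GRU)."""

    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Feed-forward and recurrent projections for the update gate and candidate.
        self.wz = nn.Linear(input_size, hidden_size, bias=False)
        self.uz = nn.Linear(hidden_size, hidden_size, bias=False)
        self.wh = nn.Linear(input_size, hidden_size, bias=False)
        self.uh = nn.Linear(hidden_size, hidden_size, bias=False)
        # Batch normalisation on the feed-forward paths, as in the original Li-GRU.
        self.bn_z = nn.BatchNorm1d(hidden_size)
        self.bn_h = nn.BatchNorm1d(hidden_size)

    def forward(self, x_t: torch.Tensor, h_prev: torch.Tensor) -> torch.Tensor:
        # Update gate: the sigmoid keeps it bounded in (0, 1).
        z_t = torch.sigmoid(self.bn_z(self.wz(x_t)) + self.uz(h_prev))
        # Candidate state: the ReLU is unbounded, so repeated application of
        # U_h through time is what can make gradients explode.
        h_cand = torch.relu(self.bn_h(self.wh(x_t)) + self.uh(h_prev))
        # Leaky interpolation between the previous state and the candidate.
        return z_t * h_prev + (1.0 - z_t) * h_cand

# Example: unrolling the cell over a random (batch, time, features) sequence.
cell = LiGRUCell(input_size=40, hidden_size=256)
x = torch.randn(8, 100, 40)
h = torch.zeros(8, 256)
for t in range(x.size(1)):
    h = cell(x[:, t], h)
```

Because the candidate activation is unbounded, the gradient backpropagated through T steps contains products of terms of the form diag(ReLU′(·)) U_h, which can grow without bound when the recurrent weights are too large; this is the mechanism the paper's stability conditions address.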
Main file
Stabilising_and_accelerating_light_gated_recurrent_units_for_automatic_speech_recognition-2.pdf (171.33 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03993123, version 1 (16-02-2023)

Licence

Copyright (All rights reserved)

Cite

Adel Moumen, Titouan Parcollet. STABILISING AND ACCELERATING LIGHT GATED RECURRENT UNITS FOR AUTOMATIC SPEECH RECOGNITION. 2023. ⟨hal-03993123⟩

Collections

UNIV-AVIGNON LIA