Conference paper, Year: 2023

Investigating the dynamics of hand and lips in French Cued Speech using attention mechanisms and CTC-based decoding

Abstract

Hard-of-hearing and profoundly deaf people use Cued Speech (CS) as a communication tool to understand spoken language. By delivering cues that convey phonetic information, CS offers a way to enhance lipreading. In the literature, there have been several studies on the dynamics between the hand and the lips in the context of human production. This article proposes a way to investigate how a neural network learns this relation for a single speaker while performing a recognition task using attention mechanisms. Further, an analysis of the learnt dynamics is used to establish the relationship between the two modalities and to extract automatic segments. For the purpose of this study, a new dataset has been recorded for French CS. Along with the release of this dataset, a benchmark is reported for word-level recognition, a first for the automatic recognition of French CS.
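The page gives no implementation details, so the sketch below is only a hypothetical, minimal PyTorch illustration of the general technique the abstract describes: a cross-attention module in which lip features attend to hand features, whose attention weights can be inspected to study hand-lip dynamics, trained with a CTC objective for recognition. The module name HandLipAttentionCTC, all layer sizes, the vocabulary size, and the class layout are illustrative assumptions, not the authors' architecture.

```python
# Hypothetical minimal sketch (NOT the authors' released code): one way to couple
# hand and lip feature streams with cross-attention and train with a CTC loss.
# All dimensions and the vocabulary size are assumptions for illustration only.
import torch
import torch.nn as nn


class HandLipAttentionCTC(nn.Module):
    def __init__(self, lip_dim=64, hand_dim=64, d_model=128, n_classes=40):
        super().__init__()
        self.lip_proj = nn.Linear(lip_dim, d_model)
        self.hand_proj = nn.Linear(hand_dim, d_model)
        # Cross-attention: lip frames (queries) attend to hand frames (keys/values);
        # the returned attention weights expose the learnt hand-lip temporal alignment.
        self.cross_attn = nn.MultiheadAttention(d_model, num_heads=4, batch_first=True)
        self.encoder = nn.GRU(2 * d_model, d_model, batch_first=True, bidirectional=True)
        self.classifier = nn.Linear(2 * d_model, n_classes)  # index 0 reserved for the CTC blank

    def forward(self, lips, hands):
        # lips: (batch, T, lip_dim), hands: (batch, T, hand_dim)
        q = self.lip_proj(lips)
        kv = self.hand_proj(hands)
        fused, attn_weights = self.cross_attn(q, kv, kv)      # fused: (batch, T, d_model)
        enc, _ = self.encoder(torch.cat([q, fused], dim=-1))
        log_probs = self.classifier(enc).log_softmax(dim=-1)
        return log_probs, attn_weights                        # attn_weights: (batch, T_lip, T_hand)


if __name__ == "__main__":
    batch, T, target_len = 2, 100, 20
    model = HandLipAttentionCTC()
    ctc = nn.CTCLoss(blank=0)
    lips = torch.randn(batch, T, 64)                          # dummy lip features
    hands = torch.randn(batch, T, 64)                         # dummy hand features
    targets = torch.randint(1, 40, (batch, target_len))       # dummy label sequences (no blanks)
    log_probs, attn = model(lips, hands)
    loss = ctc(log_probs.permute(1, 0, 2),                    # CTCLoss expects (T, batch, classes)
               targets,
               input_lengths=torch.full((batch,), T, dtype=torch.long),
               target_lengths=torch.full((batch,), target_len, dtype=torch.long))
    print(loss.item(), attn.shape)
```

Under these assumptions, decoding would amount to greedy or beam-search collapsing of the CTC output, and the attn_weights matrix could be visualized as a frame-level alignment between the lip and hand streams, which is the kind of analysis of learnt dynamics the abstract refers to.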
Main file

sankar_interspeech23.pdf (445.61 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04126530, version 1 (13-06-2023)

Identifiers

Cite

Sanjana Sankar, Denis Beautemps, Frédéric Elisei, Olivier Perrotin, Thomas Hueber. Investigating the dynamics of hand and lips in French Cued Speech using attention mechanisms and CTC-based decoding. Interspeech 2023 - 24th Annual Conference of the International Speech Communication Association, ISCA, Aug 2023, Dublin, Ireland. ⟨hal-04126530⟩