Unwritten Languages Demand Attention Too! Word Discovery with Encoder-Decoder Models
Conference paper, 2017


Abstract

Word discovery is the task of extracting words from unsegmented text. In this paper we examine to what extent neural networks can be applied to this task in a realistic unwritten-language scenario, where only small corpora and limited annotations are available. We investigate two scenarios: one with no supervision and one with limited supervision, where the most frequent words are available. The results show that it is possible to retrieve at least 27% of the gold-standard vocabulary by training an encoder-decoder neural machine translation system on only 5,157 sentences. This result is comparable to those obtained with a task-specific Bayesian nonparametric model. Moreover, our approach has the advantage of generating translation alignments, which could be used to create a bilingual lexicon. As a future perspective, this approach is also well suited to work directly from speech.
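
The abstract describes discovering word boundaries by exploiting the alignments produced by an attention-based encoder-decoder model trained on unsegmented input paired with translations. The sketch below illustrates that general idea only; it is not the paper's exact procedure. The function name, the argmax-based hard alignment, and the toy attention matrix are all illustrative assumptions, and the attention weights are assumed to be available as a NumPy array after training.

```python
import numpy as np

def segment_from_attention(source_chars, attention):
    """Segment an unsegmented character sequence using soft attention weights.

    source_chars: list of characters (unsegmented source side).
    attention: NumPy array of shape (target_len, source_len) holding the
    soft alignment weights produced by the decoder.
    """
    # Assign each source character to the target position it attends to most
    # (hard alignment by argmax over the target axis).
    alignment = attention.argmax(axis=0)  # shape: (source_len,)

    # Place a word boundary wherever consecutive characters align to
    # different target positions.
    words, current = [], [source_chars[0]]
    for i in range(1, len(source_chars)):
        if alignment[i] != alignment[i - 1]:
            words.append("".join(current))
            current = [source_chars[i]]
        else:
            current.append(source_chars[i])
    words.append("".join(current))
    return words

# Toy usage: a made-up 2x7 attention matrix over the characters of "bonjour",
# aligned to two hypothetical target words.
chars = list("bonjour")
att = np.array([[.9, .8, .9, .7, .1, .2, .1],
                [.1, .2, .1, .3, .9, .8, .9]])
print(segment_from_attention(chars, att))  # -> ['bonj', 'our']
```
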

Dates and versions

hal-01592091, version 1 (22-09-2017)

Identifiers

  • HAL Id: hal-01592091, version 1

Cite

Marcely Zanon Boito, Alexandre Bérard, Aline Villavicencio, Laurent Besacier. Unwritten Languages Demand Attention Too! Word Discovery with Encoder-Decoder Models. IEEE Automatic Speech Recognition and Understanding (ASRU), Dec 2017, Okinawa, Japan. ⟨hal-01592091⟩
