Conference Papers, Year: 2017

Unwritten Languages Demand Attention Too! Word Discovery with Encoder-Decoder Models

Abstract

Word discovery is the task of extracting words from unsegmented text. In this paper we examine to what extent neural networks can be applied to this task in a realistic unwritten-language scenario, where only small corpora and limited annotations are available. We investigate two settings: one with no supervision and another with limited supervision through access to the most frequent words. The results show that it is possible to retrieve at least 27% of the gold-standard vocabulary by training an encoder-decoder neural machine translation system with only 5,157 sentences. This result is close to those obtained with a task-specific Bayesian nonparametric model. Moreover, our approach has the advantage of generating translation alignments, which could be used to create a bilingual lexicon. As a future perspective, this approach is also well suited to working directly from speech.
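The core idea summarized above is that the soft-attention weights of an encoder-decoder translation model can be turned into word boundaries on the unsegmented side. As a minimal sketch (not the paper's exact procedure), assuming an attention matrix over target words and source symbols is already available from a trained model, each source symbol can be assigned to its most strongly attended target word and a boundary placed wherever that assignment changes; the matrix below is purely illustrative.

```python
import numpy as np

def segment_from_attention(source_symbols, attention):
    """Propose word boundaries in an unsegmented symbol sequence.

    attention: array of shape (num_target_words, num_source_symbols)
    holding soft-attention weights from an encoder-decoder model.
    Each source symbol is assigned to the target word it attends to
    most strongly; a boundary is inserted wherever that assignment changes.
    """
    # Hard alignment: for each source symbol, the index of the target word
    # with the highest attention weight.
    alignment = attention.argmax(axis=0)

    words, current = [], [source_symbols[0]]
    for i in range(1, len(source_symbols)):
        if alignment[i] != alignment[i - 1]:
            words.append("".join(current))
            current = []
        current.append(source_symbols[i])
    words.append("".join(current))
    return words

# Illustrative toy example: six unsegmented symbols, two target words.
symbols = list("abcdef")
attn = np.array([
    [0.8, 0.7, 0.6, 0.1, 0.2, 0.1],   # attention of target word 1
    [0.2, 0.3, 0.4, 0.9, 0.8, 0.9],   # attention of target word 2
])
print(segment_from_attention(symbols, attn))  # ['abc', 'def']
```

Because the same alignments link discovered source-side words to target words, they could also serve as the starting point for a bilingual lexicon, as mentioned in the abstract.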
Main file: Template.pdf (309.54 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01592091, version 1 (22-09-2017)

Identifiers

  • HAL Id: hal-01592091, version 1

Cite

Marcely Zanon Boito, Alexandre Bérard, Aline Villavicencio, Laurent Besacier. Unwritten Languages Demand Attention Too! Word Discovery with Encoder-Decoder Models. IEEE Automatic Speech Recognition and Understanding (ASRU), Dec 2017, Okinawa, Japan. ⟨hal-01592091⟩
324 Views
151 Downloads
