Conference paper, Year: 2017

Label-Dependencies Aware Recurrent Neural Networks

Abstract

In the last few years, Recurrent Neural Networks (RNNs) have proved effective on several NLP tasks. Despite such great success, their ability to model sequence labeling is still limited. This has led research toward solutions where RNNs are combined with models that have already proved effective in this domain, such as CRFs. In this work we propose a far simpler but very effective solution: an evolution of the simple Jordan RNN, where labels are re-injected as input into the network and converted into embeddings, in the same way as words. We compare this RNN variant to the other RNN models, Elman and Jordan RNNs, LSTM and GRU, on two well-known Spoken Language Understanding (SLU) tasks. Thanks to label embeddings and their combination at the hidden layer, the proposed variant, which uses more parameters than Elman and Jordan RNNs but far fewer than LSTM and GRU, is not only more effective than the other RNNs, but also outperforms sophisticated CRF models.
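
As a rough illustration (not code from the paper), the label-feedback idea described in the abstract can be sketched as follows: the label predicted at the previous position is embedded exactly like a word and combined with the current word embedding at the hidden layer. The PyTorch module below, its names and its dimensions are hypothetical assumptions, shown only to make the mechanism concrete.

    # Minimal, hypothetical sketch of a label-embedding Jordan-style RNN for
    # sequence labeling; names and dimensions are illustrative assumptions.
    import torch
    import torch.nn as nn

    class LabelAwareRNN(nn.Module):
        def __init__(self, vocab_size, num_labels, emb_dim=100, hidden_dim=200):
            super().__init__()
            self.word_emb = nn.Embedding(vocab_size, emb_dim)
            self.label_emb = nn.Embedding(num_labels, emb_dim)  # labels embedded like words
            self.hidden = nn.Linear(2 * emb_dim, hidden_dim)    # combine word + previous-label embeddings
            self.out = nn.Linear(hidden_dim, num_labels)

        def forward(self, word_ids):
            # word_ids: (seq_len,) tensor of word indices for one sentence
            prev_label = torch.zeros(1, dtype=torch.long)        # dummy "start" label
            scores = []
            for t in range(word_ids.size(0)):
                w = self.word_emb(word_ids[t].unsqueeze(0))      # (1, emb_dim)
                l = self.label_emb(prev_label)                   # (1, emb_dim)
                h = torch.tanh(self.hidden(torch.cat([w, l], dim=-1)))
                logits = self.out(h)                             # (1, num_labels)
                prev_label = logits.argmax(dim=-1)               # re-inject predicted label as input
                scores.append(logits)
            return torch.cat(scores, dim=0)                      # (seq_len, num_labels)
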
Main file

2017Cicling_NewRNN_forarXiv.pdf (409.97 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-01579071, version 1 (30-08-2017)

Identifiers

  • HAL Id: hal-01579071, version 1

Cite

Yoann Dupont, Marco Dinarelli, Isabelle Tellier. Label-Dependencies Aware Recurrent Neural Networks. Intelligent Text Processing and Computational Linguistics (CICling), Apr 2017, Budapest, Hungary. ⟨hal-01579071⟩
140 Views
136 Downloads
