Multi-Task Sequence Prediction For Tunisian Arabizi Multi-Level Annotation - HAL open archive
Conference paper, Year: 2020

Multi-Task Sequence Prediction For Tunisian Arabizi Multi-Level Annotation

Elisa Gugliotta
Marco Dinarelli
Olivier Kraif

Abstract

In this paper we propose a multi-task sequence prediction system, based on recurrent neural networks and used to annotate a Tunisian Arabizi corpus on multiple levels. The annotations performed are text classification, tokenization, PoS tagging, and encoding of Tunisian Arabizi into CODA* Arabic orthography. The system is trained to predict all the annotation levels in cascade, starting from the Arabizi input. We evaluate the system on the TIGER German corpus, suitably converting the data into a multi-task problem, in order to show the effectiveness of our neural architecture. We also show how we used the system to annotate a Tunisian Arabizi corpus, which was afterwards manually corrected and used to further evaluate sequence models on Tunisian data. Our system is developed for the Fairseq framework, which allows for fast and easy use on any other sequence prediction problem.
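To make the cascaded multi-task idea concrete, the sketch below shows one possible reading of it in PyTorch: a shared recurrent encoder followed by one prediction head per annotation level, where each level also consumes the embedded predictions of the previous level. This is not the authors' implementation (which is built on Fairseq); the task names, label-set sizes, and hyperparameters are illustrative assumptions, and the sentence-level classification task is simplified here to a token-level one.

```python
# Minimal sketch (not the paper's code) of cascaded multi-task sequence tagging:
# a shared BiGRU encoder with one classifier per annotation level, each level
# also reading the previous level's predicted labels. Sizes are assumptions.
import torch
import torch.nn as nn

TASKS = [("tokenization", 4), ("pos", 18), ("coda", 1000)]  # (name, label-set size), assumed

class CascadedTagger(nn.Module):
    def __init__(self, vocab_size, emb_dim=128, hid_dim=256, label_emb_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True, bidirectional=True)
        self.heads = nn.ModuleList()
        self.label_embeds = nn.ModuleList()
        in_dim = 2 * hid_dim
        for _, n_labels in TASKS:
            # Each head sees the encoder states plus, from the second level on,
            # the embedding of the previous level's predictions.
            self.heads.append(nn.Linear(in_dim, n_labels))
            self.label_embeds.append(nn.Embedding(n_labels, label_emb_dim))
            in_dim = 2 * hid_dim + label_emb_dim

    def forward(self, tokens):
        enc, _ = self.encoder(self.embed(tokens))      # (batch, seq, 2*hid_dim)
        features, all_logits = enc, []
        for head, lab_emb in zip(self.heads, self.label_embeds):
            logits = head(features)                    # (batch, seq, n_labels)
            all_logits.append(logits)
            preds = logits.argmax(dim=-1)              # cascade: feed predictions forward
            features = torch.cat([enc, lab_emb(preds)], dim=-1)
        return all_logits

model = CascadedTagger(vocab_size=5000)
dummy = torch.randint(1, 5000, (2, 12))               # batch of 2 sequences of length 12
for (name, _), logits in zip(TASKS, model(dummy)):
    print(name, logits.shape)
```

In such a cascade, training would sum one cross-entropy loss per level, so that errors at lower levels (e.g. tokenization) are penalized jointly with the downstream levels that depend on them.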
Main file
2020.wanlp-1.16.pdf (265.35 KB)
Origin: Publisher files allowed on an open archive

Dates and versions

hal-03096031, version 1 (13-07-2021)

Identifiers

  • HAL Id: hal-03096031, version 1

Cite

Elisa Gugliotta, Marco Dinarelli, Olivier Kraif. Multi-Task Sequence Prediction For Tunisian Arabizi Multi-Level Annotation. The Fifth Arabic Natural Language Processing Workshop (WANLP), Dec 2020, Barcelona, Spain. pp.178-191. ⟨hal-03096031⟩