Multilingual Irony Detection with Dependency Syntax and Neural Models - Archive ouverte HAL
Conference paper, Year: 2020

Multilingual Irony Detection with Dependency Syntax and Neural Models

Abstract

This paper presents an in-depth investigation of the effectiveness of dependency-based syntactic features on the irony detection task from a multilingual perspective (English, Spanish, French and Italian). It focuses on the contribution of syntactic knowledge, exploiting linguistic resources in which syntax is annotated according to the Universal Dependencies scheme. Three distinct experimental settings are provided. In the first, a variety of dependency-based syntactic features is combined with classical machine learning classifiers. In the second, two well-known types of word embeddings are trained on parsed data and tested against gold-standard datasets. In the third, dependency-based syntactic features are integrated into the Multilingual BERT architecture. The results suggest that fine-grained dependency-based syntactic information is informative for the detection of irony.
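As a rough illustration of the first setting (dependency-based syntactic features fed to a classical classifier), the following minimal sketch trains a logistic regression on n-grams of Universal Dependencies relation labels. The toy data, the choice of deprel n-grams as the feature set, and the classifier are assumptions made for illustration only; they are not the authors' exact configuration.

```python
# Hypothetical sketch: bag-of-dependency-relations features + a classical classifier.
# The feature set and the toy data are illustrative, not the paper's exact setup.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy tweets represented by their UD dependency-relation sequences
# (in practice these would come from a UD parser run over the corpora).
train_deprels = [
    "nsubj root obj punct",        # assumed ironic example (label 1)
    "advmod root obl case punct",  # assumed non-ironic example (label 0)
    "nsubj root xcomp obj punct",  # assumed ironic example (label 1)
    "det nsubj root punct",        # assumed non-ironic example (label 0)
]
train_labels = [1, 0, 1, 0]

# Unigrams and bigrams of dependency relations as sparse count features.
model = make_pipeline(
    CountVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
model.fit(train_deprels, train_labels)

# Predict the irony label for a new, already-parsed sentence.
print(model.predict(["nsubj root obj punct"]))
```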

Dates and versions

hal-03102480, version 1 (07-01-2021)

License

Attribution - NonCommercial - NoDerivatives

Identifiers

Cite

Alessandra Teresa Cignarella, Valerio Basile, Manuela Sanguinetti, Cristina Bosco, Paolo Rosso, et al.. Multilingual Irony Detection with Dependency Syntax and Neural Models. 28th International Conference on Computational Linguistics (COLING 2020), Dec 2020, Barcelona (Online), Spain. pp.1346-1358. ⟨hal-03102480⟩