MELODI at SemEval-2023 Task 3: In-domain Pre-training for Low-resource Classification of News Articles
Conference paper, Year: 2023


Abstract

This paper describes our approach to Subtask 1 "News Genre Categorization" of SemEval-2023 Task 3 "Detecting the Category, the Framing, and the Persuasion Techniques in Online News in a Multilingual Setup", which aims to determine whether a given news article is an opinion piece, an objective report, or satire. We fine-tuned the domain-specific language model POLITICS, which was pre-trained on a large-scale dataset of more than 3.6M English political news articles following ideology-driven pre-training objectives. In order to use it in the multilingual setup of the task, we added a pre-processing step that translates all documents into English. Our system ranked among the top systems overall in most languages and ranked 1st on the English dataset.
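As an illustration of the pipeline described above, the following is a minimal sketch of the inference step: a document is first translated into English, then classified into one of the three genres with a POLITICS-based sequence classifier. The checkpoint name `launch/POLITICS`, the `translate_to_english` helper, and the exact label set are assumptions made for illustration; they are not specified in this record.

```python
# Minimal sketch of the translate-then-classify pipeline (not the authors' code).
# Assumptions: the POLITICS encoder is loaded from a Hugging Face checkpoint
# named "launch/POLITICS", and translation is abstracted behind a placeholder.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = ["opinion", "reporting", "satire"]  # the three genres of Subtask 1

MODEL_NAME = "launch/POLITICS"  # assumed checkpoint name
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
# A 3-way classification head is added on top of the pre-trained encoder;
# in the paper this head would be obtained by fine-tuning on the Subtask 1 data.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME, num_labels=3)
model.eval()


def translate_to_english(document: str) -> str:
    """Placeholder for the translation pre-processing step.

    The paper translates all documents into English before classification;
    the MT system used is not given here, so this stub returns the input as-is.
    """
    return document


def classify(document: str) -> str:
    """Translate a news article to English and predict its genre."""
    text = translate_to_english(document)
    inputs = tokenizer(text, truncation=True, max_length=512, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return LABELS[int(logits.argmax(dim=-1))]


if __name__ == "__main__":
    print(classify("The government announced a new budget plan on Tuesday."))
```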
Main file: 2023.semeval-1.14.pdf (218.48 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04310778, version 1 (27-11-2023)

Identifiers

HAL Id: hal-04310778
DOI: 10.18653/v1/2023.semeval-1.14

Cite

Nicolas Devatine, Philippe Muller, Chloé Braud. MELODI at SemEval-2023 Task 3: In-domain Pre-training for Low-resource Classification of News Articles. Proceedings of the 17th International Workshop on Semantic Evaluation (SemEval-2023), SIGLEX: Special Interest Group on the Lexicon of the Association for Computational Linguistics, Jul 2023, Toronto, Canada. pp. 108-113, ⟨10.18653/v1/2023.semeval-1.14⟩. ⟨hal-04310778⟩
