Cross-lingual Strategies for Low-resource Language Modeling: A Study on Five Indic Dialects - Archive ouverte HAL
Conference Paper Year: 2023

Cross-lingual Strategies for Low-resource Language Modeling: A Study on Five Indic Dialects

Abstract

Neural language models play an increasingly central role in language processing, given their success across a range of NLP tasks. In this study, we compare several canonical strategies in language modeling for low-resource scenarios, evaluating all models by their (finetuned) performance on a POS-tagging downstream task. We work with five (extremely) low-resource dialects from the Indic dialect continuum (Braj, Awadhi, Bhojpuri, Magahi, Maithili), which are closely related to each other and to the standard mid-resource dialect, Hindi. The strategies we evaluate broadly include from-scratch pretraining and cross-lingual transfer between the dialects, as well as from different kinds of off-the-shelf multilingual models; we find that a model pretrained on other mid-resource Indic dialects and languages, with extended pretraining on target dialect data, consistently outperforms other models. We interpret our results in terms of dataset sizes, phylogenetic relationships, and corpus statistics, as well as particularities of this linguistic system.
Main file
461064.pdf (3.53 MB)
Origin: Publisher files authorized on an open archive

Dates and versions

hal-04130175, version 1 (20-06-2023)

Identifiers

  • HAL Id: hal-04130175, version 1

Cite

Niyati Bafna, Cristina España-Bonet, Josef van Genabith, Benoît Sagot, Rachel Bawden. Cross-lingual Strategies for Low-resource Language Modeling: A Study on Five Indic Dialects. 18e Conférence en Recherche d'Information et Applications -- 16e Rencontres Jeunes Chercheurs en RI -- 30e Conférence sur le Traitement Automatique des Langues Naturelles -- 25e Rencontre des Étudiants Chercheurs en Informatique pour le Traitement Automatique des Langues, Jun 2023, Paris, France. pp. 28-42. ⟨hal-04130175⟩
