Journal article in Frontiers in Artificial Intelligence, 2023

Improving text mining in plant health domain with GAN and/or pre-trained language model

Abstract

The Bidirectional Encoder Representations from Transformers (BERT) architecture offers a cutting-edge approach to Natural Language Processing. It involves two steps: (1) pre-training a language model to extract contextualized features and (2) fine-tuning for specific downstream tasks. Although pre-trained language models (PLMs) have been successful in various text-mining applications, challenges remain, particularly in areas with limited labeled data such as plant health hazard detection from individuals' observations. To address this challenge, we propose combining GAN-BERT, a model that extends the fine-tuning process to unlabeled data through a Generative Adversarial Network (GAN), with ChouBERT, a domain-specific PLM. Our results show that GAN-BERT outperforms traditional fine-tuning on multiple text classification tasks. We further examine the impact of additional pre-training on the GAN-BERT model and experiment with different hyperparameters to determine the best combination of models and fine-tuning parameters. Our findings suggest that combining GAN and ChouBERT can enhance the generalizability of the text classifier, but may also lead to increased instability during training. Finally, we provide recommendations to mitigate these instabilities.
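The abstract describes the GAN-BERT scheme: a generator produces fake feature vectors, while a discriminator built on top of the PLM encoder classifies inputs into the k real classes plus one extra "fake" class, which is what lets unlabeled examples contribute to fine-tuning. The sketch below illustrates this setup in PyTorch with the Hugging Face transformers library; it is a minimal illustration under assumptions, not the authors' implementation. The bert-base-cased checkpoint stands in for ChouBERT, and the layer sizes, learning rates, and the train_step helper are all hypothetical choices made for the example.

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

NUM_CLASSES = 2   # real classes; the discriminator adds one extra "fake" class
HIDDEN = 768      # hidden size of BERT-base-style encoders
NOISE_DIM = 100   # dimensionality of the generator's noise input (assumed)

class Generator(nn.Module):
    # Maps random noise to fake feature vectors mimicking [CLS] embeddings.
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(NOISE_DIM, HIDDEN), nn.LeakyReLU(0.2),
            nn.Linear(HIDDEN, HIDDEN),
        )
    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    # Classifies feature vectors into NUM_CLASSES real classes + 1 fake class.
    def __init__(self):
        super().__init__()
        self.body = nn.Sequential(
            nn.Linear(HIDDEN, HIDDEN), nn.LeakyReLU(0.2), nn.Dropout(0.1),
        )
        self.head = nn.Linear(HIDDEN, NUM_CLASSES + 1)
    def forward(self, h):
        return self.head(self.body(h))

# Placeholder checkpoint; a domain-specific PLM such as ChouBERT would be
# loaded the same way from its own checkpoint name.
tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
encoder = AutoModel.from_pretrained("bert-base-cased")
G, D = Generator(), Discriminator()
opt_g = torch.optim.AdamW(G.parameters(), lr=5e-5)
opt_d = torch.optim.AdamW(list(encoder.parameters()) + list(D.parameters()), lr=5e-5)
ce = nn.CrossEntropyLoss()
FAKE = NUM_CLASSES  # index of the extra "fake" class

def train_step(texts, labels):
    # One adversarial step; labels uses -1 for unlabeled examples.
    enc = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
    h_real = encoder(**enc).last_hidden_state[:, 0]   # [CLS] features
    h_fake = G(torch.randn(len(texts), NOISE_DIM))

    # Discriminator update: supervised cross-entropy on labeled data, plus
    # real-vs-fake terms that every example, labeled or not, contributes to.
    logits_real = D(h_real)
    logits_fake = D(h_fake.detach())
    y = torch.tensor(labels)
    mask = y >= 0
    loss_sup = ce(logits_real[mask], y[mask]) if mask.any() else 0.0
    p_real = torch.softmax(logits_real, dim=-1)
    p_fake = torch.softmax(logits_fake, dim=-1)
    loss_real = -torch.log(1.0 - p_real[:, FAKE] + 1e-8).mean()  # real is not fake
    loss_fake = -torch.log(p_fake[:, FAKE] + 1e-8).mean()        # fake is fake
    d_loss = loss_sup + loss_real + loss_fake
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator update: produce features the discriminator accepts as real.
    p_g = torch.softmax(D(h_fake), dim=-1)
    g_loss = -torch.log(1.0 - p_g[:, FAKE] + 1e-8).mean()
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
    return d_loss.item(), g_loss.item()

# Hypothetical usage: a labeled plant-health observation and an unlabeled tweet.
train_step(["late blight spots on my tomato leaves", "nice weather today"], [1, -1])

In this k+1-class formulation, the supervised term applies only to labeled examples, while all examples drive the real-vs-fake terms, which is how unlabeled data extends fine-tuning. The adversarial balance between generator and discriminator is also where the training instabilities mentioned in the abstract can arise.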
Main file: GAN_ChouBERT.pdf (2.97 MB)
Origin: Publication funded by an institution
License: CC BY - Attribution

Dates and versions

hal-04008864, version 1 (26-09-2023)

Identifiers

hal-04008864
DOI: 10.3389/frai.2023.1072329

Cite

Shufan Jiang, Stéphane Cormier, Rafael Angarita, Francis Rousseaux. Improving text mining in plant health domain with GAN and/or pre-trained language model. Frontiers in Artificial Intelligence, 2023, 6, pp.1072329. ⟨10.3389/frai.2023.1072329⟩. ⟨hal-04008864⟩