Conference paper, 2023

BERTEPro : A new Sentence Embedding Framework for the Education and Professional Training domain

Guillaume Lefebvre
Haytham Elghazel
Théodore Guillet
Alexandre Aussem
Matthieu Sonnati

Abstract

FlauBERT and CamemBERT have established a new state of the art for French language understanding. More recently, SBERT has transformed the use of BERT by sharply reducing the computational cost of sentence-similarity comparison while preserving BERT's accuracy. However, these models were trained on general-purpose French texts, which prevents a fine-grained representation of texts from specific domains such as education and professional training. In this paper, we present BERTEPro, a sentence embedding framework based on FlauBERT, whose pre-training with MLM (Masked Language Modeling) was extended on education and professional training texts before fine-tuning on NLI (Natural Language Inference) and STS (Semantic Textual Similarity) tasks. The evaluation of BERTEPro on STS tasks, as well as on classification tasks, confirms that the proposed methodology offers significant advantages over other state-of-the-art methods.
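For context, the two-stage recipe the abstract describes (continued MLM pre-training of FlauBERT on domain text, then SBERT-style fine-tuning on NLI followed by STS) can be sketched with the Hugging Face transformers, datasets, and sentence-transformers libraries. This is a minimal illustration under stated assumptions, not the authors' implementation: the corpus file, hyperparameters, output paths, and placeholder training pairs are invented for the example.

from torch.utils.data import DataLoader
from datasets import load_dataset
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)
from sentence_transformers import InputExample, SentenceTransformer, losses, models

# --- Step 1: extend FlauBERT's MLM pre-training on domain text. ---
base = "flaubert/flaubert_base_cased"
tokenizer = AutoTokenizer.from_pretrained(base)
mlm_model = AutoModelForMaskedLM.from_pretrained(base)

# "edu_corpus.txt" is a hypothetical stand-in for the education and
# professional training corpus used in the paper.
corpus = load_dataset("text", data_files={"train": "edu_corpus.txt"})["train"]
corpus = corpus.map(
    lambda batch: tokenizer(batch["text"], truncation=True, max_length=256),
    batched=True,
    remove_columns=["text"],
)

Trainer(
    model=mlm_model,
    args=TrainingArguments(output_dir="flaubert-edu-mlm", num_train_epochs=1),
    train_dataset=corpus,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm_probability=0.15),
).train()
mlm_model.save_pretrained("flaubert-edu-mlm")
tokenizer.save_pretrained("flaubert-edu-mlm")

# --- Step 2: SBERT-style fine-tuning, first on NLI, then on STS. ---
word = models.Transformer("flaubert-edu-mlm", max_seq_length=256)
pool = models.Pooling(word.get_word_embedding_dimension())
sbert = SentenceTransformer(modules=[word, pool])

# NLI stage: classify premise/hypothesis pairs (placeholder example shown).
nli_data = [InputExample(texts=["premise ...", "hypothesis ..."], label=0)]
nli_loss = losses.SoftmaxLoss(
    sbert,
    sentence_embedding_dimension=sbert.get_sentence_embedding_dimension(),
    num_labels=3,
)
sbert.fit([(DataLoader(nli_data, batch_size=16, shuffle=True), nli_loss)], epochs=1)

# STS stage: regress embedding cosine similarity toward gold scores in [0, 1].
sts_data = [InputExample(texts=["sentence A", "sentence B"], label=0.8)]
sbert.fit(
    [(DataLoader(sts_data, batch_size=16, shuffle=True),
      losses.CosineSimilarityLoss(sbert))],
    epochs=1,
)
sbert.save("bertepro-style-model")

Staging NLI before STS mirrors the original SBERT recipe: the classification objective first shapes the embedding space, and the cosine-similarity regression then calibrates it for similarity scoring.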
No file deposited

Dates and versions

hal-04693670, version 1 (10-09-2024)

Identifiers

Cite

Guillaume Lefebvre, Haytham Elghazel, Théodore Guillet, Alexandre Aussem, Matthieu Sonnati. BERTEPro : A new Sentence Embedding Framework for the Education and Professional Training domain. SAC '23: 38th ACM/SIGAPP Symposium on Applied Computing, Mar 2023, Tallinn, Estonia. pp. 929-935, ⟨10.1145/3555776.3577715⟩. ⟨hal-04693670⟩