Conference paper, Year: 2021

On Refining BERT Contextualized Embeddings using Semantic Lexicons

Georgios Zervakis
Emmanuel Vincent
Miguel Couceiro
Marc Schoenauer

Abstract

Word vector representations play a fundamental role in many NLP applications. Exploiting human-curated knowledge has been shown to improve the quality of word embeddings and their performance on many downstream tasks. Retrofitting is a simple and popular technique for refining distributional word embeddings based on relations from a semantic lexicon. Inspired by this technique, we present two methods for incorporating such knowledge into contextualized embeddings. We evaluate these methods with BERT embeddings on three biomedical datasets for relation extraction and one movie review dataset for sentiment analysis. We demonstrate that the retrofitted vectors do not substantially impact performance on these tasks, and we conduct a qualitative analysis to provide further insight into this negative result.
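For context, retrofitting in its original word-level form (Faruqui et al., 2015) iteratively pulls each vector towards the mean of its lexicon neighbours while anchoring it to the original distributional vector. The sketch below shows that classic update only, not the paper's two contextualized variants; the function name, data layout, and the alpha/beta weights are illustrative choices.

```python
import numpy as np

def retrofit(embeddings, lexicon, iterations=10, alpha=1.0, beta=1.0):
    """Word-level retrofitting update of Faruqui et al. (2015).

    embeddings: dict word -> np.ndarray, the original (distributional) vectors.
    lexicon:    dict word -> list of semantically related words.
    Returns a new dict of refined vectors.
    """
    # Start from a copy of the original vectors.
    new_vecs = {w: v.copy() for w, v in embeddings.items()}
    for _ in range(iterations):
        for word, related in lexicon.items():
            neighbours = [n for n in related if n in new_vecs]
            if word not in new_vecs or not neighbours:
                continue
            # Closed-form coordinate update: move towards the lexicon
            # neighbours while staying close to the original vector.
            neighbour_sum = beta * sum(new_vecs[n] for n in neighbours)
            new_vecs[word] = (neighbour_sum + alpha * embeddings[word]) / (
                beta * len(neighbours) + alpha
            )
    return new_vecs

# Toy example with a symmetric two-word lexicon.
vecs = {"happy": np.array([1.0, 0.0]), "glad": np.array([0.0, 1.0])}
lex = {"happy": ["glad"], "glad": ["happy"]}
refined = retrofit(vecs, lex)
```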
Main file
On_Refining_BERT_Contextualized_Embeddings_using_Semantic_Lexicons__CSSA.pdf (297.79 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-03318571, version 1 (10-08-2021)

Identifiers

  • HAL Id: hal-03318571, version 1

Cite

Georgios Zervakis, Emmanuel Vincent, Miguel Couceiro, Marc Schoenauer. On Refining BERT Contextualized Embeddings using Semantic Lexicons. ECML PKDD 2021 - Machine Learning with Symbolic Methods and Knowledge Graphs, workshop co-located with the European Conference on Machine Learning and Principles and Practice of Knowledge Discovery in Databases, Sep 2021, Online, Spain. ⟨hal-03318571⟩
343 views
645 downloads
