Journal article in Journal of Information Science, 2021

Unsupervised extractive multi-document summarization method based on transfer learning from BERT multi-task fine-tuning

Abstract

Text representation is a cornerstone that strongly affects the effectiveness of many text summarization methods. Transfer learning using pre-trained word embedding models has shown promising results. However, most of these representations do not account for word order or the semantic relationships between words in a sentence, and thus fail to capture the meaning of a full sentence. To overcome this issue, this study proposes an unsupervised method for extractive multi-document summarization based on transfer learning from a BERT sentence embedding model. Moreover, to improve sentence representation learning, we fine-tune the BERT model on supervised intermediate tasks from the GLUE benchmark datasets using both single-task and multi-task fine-tuning. Experiments are performed on the standard DUC’2002–2004 datasets. The results show that our method significantly outperforms several baseline methods and achieves comparable, and sometimes better, performance than recent state-of-the-art deep learning–based methods. Furthermore, fine-tuning BERT with multi-task learning considerably improves performance.
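The abstract outlines the pipeline only at a high level: sentences are embedded with a (fine-tuned) BERT model and then scored and selected without supervision. The sketch below illustrates one common unsupervised scoring scheme of this kind, centroid-based ranking over mean-pooled BERT embeddings. The exact encoder checkpoint, pooling strategy, and scoring function are not given in the abstract, so the `bert-base-uncased` checkpoint and the `embed`/`summarize` helpers here are illustrative assumptions, not the authors' method.

```python
# Minimal sketch: centroid-based extractive summarization with BERT
# sentence embeddings. Assumptions (not from the paper): mean pooling,
# cosine similarity to the document centroid, bert-base-uncased.
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def embed(sentences):
    """Mean-pooled BERT token embeddings as sentence vectors."""
    batch = tokenizer(sentences, padding=True, truncation=True,
                      return_tensors="pt")
    with torch.no_grad():
        hidden = model(**batch).last_hidden_state        # (B, T, H)
    mask = batch["attention_mask"].unsqueeze(-1)         # (B, T, 1)
    summed = (hidden * mask).sum(dim=1)                  # ignore padding
    counts = mask.sum(dim=1).clamp(min=1)
    return summed / counts                               # (B, H)

def summarize(sentences, k=3):
    """Rank sentences by cosine similarity to the document centroid."""
    vecs = torch.nn.functional.normalize(embed(sentences), dim=1)
    centroid = torch.nn.functional.normalize(
        vecs.mean(dim=0, keepdim=True), dim=1)
    scores = (vecs @ centroid.T).squeeze(1)
    top = scores.topk(min(k, len(sentences))).indices.sort().values
    return [sentences[i] for i in top]                   # original order

sentences = [
    "Transfer learning reuses knowledge from pre-trained models.",
    "BERT produces contextual representations of words and sentences.",
    "Extractive summarization selects salient sentences from the input.",
    "The weather was pleasant on the day of the experiment.",
]
print(summarize(sentences, k=2))
```

In the paper's setting, the encoder would additionally be fine-tuned on GLUE intermediate tasks (single-task or multi-task) before encoding; the off-the-shelf checkpoint above stands in for that fine-tuned model.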
No file deposited

Dates and versions

hal-03594048, version 1 (02-03-2022)

Identifiers

Cite

Salima Lamsiyah, Abdelkader El Mahdaouy, Saïd El Alaoui Ouatik, Bernard Espinasse. Unsupervised extractive multi-document summarization method based on transfer learning from BERT multi-task fine-tuning. Journal of Information Science, 2021, pp. 016555152199061. ⟨10.1177/0165551521990616⟩. ⟨hal-03594048⟩