AraBART: a Pretrained Arabic Sequence-to-Sequence Model for Abstractive Summarization
Conference paper, 2022

Dates and versions

hal-04494404, version 1 (07-03-2024)

Identifiers

Cite

Moussa Kamal Eddine, Nadi Tomeh, Nizar Habash, Joseph Le Roux, Michalis Vazirgiannis. AraBART: a Pretrained Arabic Sequence-to-Sequence Model for Abstractive Summarization. Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP), Dec 2022, Abu Dhabi, United Arab Emirates. pp.31-42, ⟨10.18653/v1/2022.wanlp-1.4⟩. ⟨hal-04494404⟩