Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract)
Conference paper. Year: 2024

Abstract

Seq-to-seq generative models have recently gained attention for solving the relation extraction task. By approaching this problem as an end-to-end task, they surpass encoder-only models. However, little research has investigated the effect of the output syntax on the training process of these models, and only a limited number of approaches have been proposed for extracting ready-to-load knowledge graphs that follow the RDF standard. In this paper, we start from the observation that a set of triples can be linearized in many different ways, and we evaluate the combined effect of language model size and RDF syntax on the task of relation extraction from Wikipedia abstracts.
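
To make the notion of triple linearization concrete, the sketch below (not code from the paper; it assumes the rdflib Python library and hypothetical DBpedia-style triples chosen for illustration) serializes one and the same small graph under several standard RDF syntaxes such as N-Triples, Turtle, RDF/XML and JSON-LD.

# Minimal sketch, assuming the rdflib library and illustrative triples.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, RDFS

DBR = Namespace("http://dbpedia.org/resource/")
DBO = Namespace("http://dbpedia.org/ontology/")

g = Graph()
g.bind("dbr", DBR)
g.bind("dbo", DBO)

# Hypothetical triples about a Wikipedia entity, for illustration only.
g.add((DBR.Ada_Lovelace, RDF.type, DBO.Person))
g.add((DBR.Ada_Lovelace, DBO.birthPlace, DBR.London))
g.add((DBR.Ada_Lovelace, RDFS.label, Literal("Ada Lovelace", lang="en")))

# The same graph, linearized under different standard RDF serializations.
# "json-ld" ships with rdflib >= 6.0; earlier versions need a plugin.
for fmt in ["nt", "turtle", "xml", "json-ld"]:
    print(f"--- {fmt} ---")
    print(g.serialize(format=fmt))

Each serialization is a different textual linearization of the identical triple set; the choice among such output syntaxes is the variable evaluated in the paper.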

Dates and versions

hal-04384724, version 1 (29-03-2024)

Cite

Célian Ringwald, Fabien Gandon, Catherine Faron, Franck Michel, Hanna Abi Akl. Well-Written Knowledge Graphs: Most Effective RDF Syntaxes for Triple Linearization in End-to-End Extraction of Relations from Texts (Student Abstract). AAAI 2024 - 38th Annual AAAI Conference on Artificial Intelligence, Association for the Advancement of Artificial Intelligence, Feb 2024, Vancouver, Canada. pp.23631-23632, ⟨10.1609/aaai.v38i21.30502⟩. ⟨hal-04384724⟩