Conference paper, Year: 2024

Knowledge Base Grounded Pre-trained Language Models via Distillation

No file deposited

Dates and versions

hal-04722293, version 1 (04-10-2024)

Identifiers

HAL Id: hal-04722293
DOI: 10.1145/3605098.3635888

Cite

Raphaël Sourty, Jose Moreno, François-Paul Servant, Lynda Tamine. Knowledge Base Grounded Pre-trained Language Models via Distillation. SAC '24: 39th ACM/SIGAPP Symposium on Applied Computing, Apr 2024, Avila, Spain. pp.1617-1625, ⟨10.1145/3605098.3635888⟩. ⟨hal-04722293⟩