Conference paper, Year: 2024

A Glitch in the Matrix? Locating and Detecting Language Model Grounding with Fakepedia

Abstract

Large language models (LLMs) have an impressive ability to draw on novel information supplied in their context. Yet the mechanisms underlying this contextual grounding remain unknown, especially in situations where contextual information contradicts factual knowledge stored in the parameters, which LLMs also excel at recalling. Favoring the contextual information is critical for retrieval-augmented generation methods, which enrich the context with up-to-date information, hoping that grounding can rectify outdated or noisy stored knowledge. We present a novel method to study grounding abilities using Fakepedia, a dataset of counterfactual texts constructed to clash with a model's internal parametric knowledge. We benchmark various LLMs with Fakepedia and conduct a causal mediation analysis of LLM components when answering Fakepedia queries, based on our Masked Grouped Causal Tracing (MGCT) method. Through this analysis, we identify distinct computational patterns between grounded and ungrounded responses. We finally demonstrate that distinguishing grounded from ungrounded responses is achievable through computational analysis alone. Our results, together with existing findings about factual recall mechanisms, provide a coherent narrative of how grounding and factual recall mechanisms interact within LLMs.
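
As a rough illustration of the grounding question the abstract describes, the sketch below shows how one might check whether a causal language model prefers a counterfactual fact supplied in context over the fact stored in its parameters. This is not the authors' Fakepedia pipeline or their MGCT method; the model name, the example fact, and the helper function are illustrative assumptions.

```python
# Minimal sketch: does a causal LM ground on a counterfactual context,
# or fall back on parametric knowledge? (Illustrative, not the paper's code.)
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "gpt2"  # assumption: any Hugging Face causal LM works for this probe
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)
model.eval()

# Counterfactual passage that contradicts a fact likely stored in the weights.
context = "The Eiffel Tower is located in Rome."
query = "The Eiffel Tower is located in the city of"

def next_token_logprob(prompt: str, target: str) -> float:
    """Log-probability of `target` as the next token following `prompt`."""
    target_id = tokenizer.encode(" " + target)[0]
    input_ids = tokenizer(prompt, return_tensors="pt").input_ids
    with torch.no_grad():
        logits = model(input_ids).logits[0, -1]
    return torch.log_softmax(logits, dim=-1)[target_id].item()

prompt = context + " " + query
grounded_score = next_token_logprob(prompt, "Rome")    # answer supported by the context
parametric_score = next_token_logprob(prompt, "Paris") # answer stored in the parameters
print("grounded" if grounded_score > parametric_score else "ungrounded")
```

Comparing the two scores gives a simple behavioral label (grounded vs. ungrounded) for a single query; the paper goes further by tracing which internal components mediate that choice.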
Main file: Paper_hallucinations.pdf (696.91 KB)
Origin: Files produced by the author(s)

Dates and versions

hal-04701145, version 1 (20-09-2024)


Cite

Giovanni Monea, Maxime Peyrard, Martin Josifoski, Vishrav Chaudhary, Jason Eisner, et al. A Glitch in the Matrix? Locating and Detecting Language Model Grounding with Fakepedia. ACL 2024, Aug 2024, Bangkok, Thailand. pp.6828-6844, ⟨10.18653/v1/2024.acl-long.369⟩. ⟨hal-04701145⟩