MarkedBERT: Integrating Traditional IR Cues in Pre-trained Language Models for Passage Retrieval
Abstract
The Information Retrieval (IR) community has witnessed a flourishing development of deep neural networks; however, only a few have managed to beat strong baselines. Among them, models like DRMM and DUET achieved better results thanks to their proper handling of exact match signals. Nowadays, the application of pre-trained language models to IR tasks has achieved impressive results, exceeding all previous work. In this paper, we assume that established IR cues like exact term matching, proven to be valuable for deep neural models, can be used to augment the direct supervision from labeled data for training these pre-trained models. To study the effectiveness of this assumption, we propose MarkedBERT, a modified version of BERT, one of the most popular models pre-trained via language modeling tasks. MarkedBERT integrates exact match signals using a marking technique that locates and highlights exact-matched query-document terms with marker tokens. Experiments on the MS MARCO Passage Ranking task show that our rather simple approach is effective. We find that augmenting the input with marker tokens allows the model to focus on text sequences that are valuable for IR.
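For intuition, below is a minimal sketch of such a marking step. Everything in it is an illustrative assumption rather than the paper's exact scheme: the marker tokens [e] and [/e], the regex tokenization, the stopword list, and the helper name mark_exact_matches are all hypothetical (the paper's markers may differ, e.g. indexed per matched term, and would be added to BERT's vocabulary before fine-tuning).

```python
import re

# Hypothetical marker tokens; the paper's actual marker scheme may differ.
OPEN_MARK, CLOSE_MARK = "[e]", "[/e]"

# Small illustrative stopword list; matches on these are not marked.
STOPWORDS = {"the", "a", "an", "of", "in", "on", "and", "or", "to", "is", "what"}

def tokenize(text):
    """Lowercased word tokens via a simple regex (illustrative only)."""
    return re.findall(r"[a-z0-9]+", text.lower())

def mark_exact_matches(query, passage):
    """Wrap every non-stopword term occurring in both the query and the
    passage with marker tokens, in both texts."""
    matched = (set(tokenize(query)) & set(tokenize(passage))) - STOPWORDS

    def mark(text):
        def repl(m):
            word = m.group(0)
            if word.lower() in matched:
                return f"{OPEN_MARK} {word} {CLOSE_MARK}"
            return word
        return re.sub(r"[A-Za-z0-9]+", repl, text)

    return mark(query), mark(passage)

if __name__ == "__main__":
    q = "effects of caffeine on sleep"
    p = "Caffeine can delay sleep onset and reduce total sleep time."
    mq, mp = mark_exact_matches(q, p)
    # The marked pair would then be fed to BERT as:
    # [CLS] mq [SEP] mp [SEP]
    print(mq)  # effects of [e] caffeine [/e] on [e] sleep [/e]
    print(mp)  # [e] Caffeine [/e] can delay [e] sleep [/e] onset ...
```

The marked query-passage pair is then scored like any standard BERT re-ranker input; the markers simply make exact term matches explicit in the token sequence.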