NPRF: A Neural Pseudo Relevance Feedback Framework for Ad-hoc Information Retrieval

Canjia Li, Yingfei Sun, Ben He, Le Wang, Kai Hui, Andrew Yates, Le Sun, Jungang Xu


Abstract
Pseudo relevance feedback (PRF) is commonly used to boost the performance of traditional information retrieval (IR) models by using top-ranked documents to identify and weight new query terms, thereby reducing the effect of query-document vocabulary mismatches. While neural retrieval models have recently demonstrated strong results for ad-hoc retrieval, combining them with PRF is not straightforward due to incompatibilities between existing PRF approaches and neural architectures. To bridge this gap, we propose an end-to-end neural PRF framework that can be used with existing neural IR models by embedding different neural models as building blocks. Extensive experiments on two standard test collections confirm the effectiveness of the proposed NPRF framework in improving the performance of two state-of-the-art neural IR models.
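To make the PRF idea described in the abstract concrete, below is a minimal illustrative sketch (not taken from the paper or the ucasir/NPRF code): a target document is re-scored against the top-ranked feedback documents from an initial retrieval round, with each feedback document weighted by its initial retrieval score. The neural_rel function is a hypothetical stand-in (plain cosine similarity over term frequencies) for the neural IR building blocks the framework would embed (e.g., DRMM or K-NRM), and the name nprf_style_score is illustrative only.

    # Minimal PRF-style re-scoring sketch, assuming a first-stage ranker has
    # already produced feedback documents and their initial scores.
    from collections import Counter
    import math


    def tf_vector(text):
        """Bag-of-words term-frequency vector for whitespace-tokenized text."""
        return Counter(text.lower().split())


    def neural_rel(doc_a, doc_b):
        """Hypothetical stand-in for a neural relevance model:
        cosine similarity between term-frequency vectors."""
        va, vb = tf_vector(doc_a), tf_vector(doc_b)
        dot = sum(va[t] * vb[t] for t in va)
        norm_a = math.sqrt(sum(v * v for v in va.values()))
        norm_b = math.sqrt(sum(v * v for v in vb.values()))
        norm = norm_a * norm_b
        return dot / norm if norm else 0.0


    def nprf_style_score(target_doc, feedback_docs, initial_scores):
        """Combine the target document's relevance to each feedback document,
        weighting each feedback document by its (normalized) initial score.
        The query enters only through those initial scores."""
        total = sum(initial_scores) or 1.0
        return sum(
            (score / total) * neural_rel(fd, target_doc)
            for fd, score in zip(feedback_docs, initial_scores)
        )


    if __name__ == "__main__":
        # Feedback documents and scores as returned by a first-round ranker
        # (e.g., BM25) for the query "neural pseudo relevance feedback".
        feedback_docs = [
            "pseudo relevance feedback expands queries with terms from top documents",
            "neural retrieval models match queries and documents with embeddings",
        ]
        initial_scores = [2.1, 1.7]
        target = "feedback documents help bridge vocabulary mismatch in retrieval"
        print(round(nprf_style_score(target, feedback_docs, initial_scores), 4))

This sketch only illustrates the feedback-document weighting idea; the actual framework trains the embedded neural models end to end rather than using a fixed similarity function.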
Anthology ID:
D18-1478
Volume:
Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing
Month:
October-November
Year:
2018
Address:
Brussels, Belgium
Editors:
Ellen Riloff, David Chiang, Julia Hockenmaier, Jun’ichi Tsujii
Venue:
EMNLP
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
4482–4491
URL:
https://aclanthology.org/D18-1478
DOI:
10.18653/v1/D18-1478
Cite (ACL):
Canjia Li, Yingfei Sun, Ben He, Le Wang, Kai Hui, Andrew Yates, Le Sun, and Jungang Xu. 2018. NPRF: A Neural Pseudo Relevance Feedback Framework for Ad-hoc Information Retrieval. In Proceedings of the 2018 Conference on Empirical Methods in Natural Language Processing, pages 4482–4491, Brussels, Belgium. Association for Computational Linguistics.
Cite (Informal):
NPRF: A Neural Pseudo Relevance Feedback Framework for Ad-hoc Information Retrieval (Li et al., EMNLP 2018)
PDF:
https://aclanthology.org/D18-1478.pdf
Code:
ucasir/NPRF
Data:
Robust04