Corpus-Steered Query Expansion with Large Language Models

Yibin Lei, Yu Cao, Tianyi Zhou, Tao Shen, Andrew Yates


Abstract
Recent studies demonstrate that query expansion with large language models (LLMs) can considerably enhance information retrieval systems: the LLM generates hypothetical documents that answer the query, and these serve as expansions. However, misalignments between the expansions and the retrieval corpus cause problems such as hallucinations and outdated information, owing to the limited intrinsic knowledge of LLMs. Inspired by Pseudo Relevance Feedback (PRF), we introduce Corpus-Steered Query Expansion (CSQE) to promote the incorporation of knowledge embedded within the corpus. CSQE utilizes the relevance-assessing capability of LLMs to systematically identify pivotal sentences in the initially retrieved documents. These corpus-originated texts are then used to expand the query together with LLM-knowledge-empowered expansions, improving relevance prediction between the query and the target documents. Extensive experiments reveal that CSQE exhibits strong performance without requiring any training, especially on queries for which LLMs lack knowledge.
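The abstract outlines a two-branch expansion pipeline. The following is a minimal sketch of that pipeline, assuming a text-in/text-out LLM interface and a first-stage retriever such as BM25; the function names, prompt wording, and query-repetition heuristic are illustrative assumptions for exposition, not the authors' implementation.

```python
# Hypothetical sketch of the CSQE pipeline as described in the abstract.
# `retrieve` and `llm` are caller-supplied; nothing here reflects the
# paper's actual prompts or code.
from typing import Callable, List

def csqe_expand(
    query: str,
    retrieve: Callable[[str, int], List[str]],  # e.g., BM25 search over the corpus
    llm: Callable[[str], str],                  # any text-in/text-out LLM interface
    top_k: int = 10,
) -> str:
    """Combine corpus-steered and LLM-knowledge expansions into one query."""
    # 1) First-pass retrieval, as in pseudo relevance feedback (PRF).
    initial_docs = retrieve(query, top_k)

    # 2) Corpus-steered branch: ask the LLM to assess relevance and extract
    #    pivotal sentences from the initially retrieved documents.
    numbered = "\n".join(f"[{i}] {doc}" for i, doc in enumerate(initial_docs))
    corpus_expansion = llm(
        f"Query: {query}\nDocuments:\n{numbered}\n"
        "Identify the documents relevant to the query and copy out the "
        "sentences pivotal for answering it."
    )

    # 3) LLM-knowledge branch: a hypothetical document answering the query,
    #    in the spirit of generation-based expansion methods.
    knowledge_expansion = llm(f"Write a short passage that answers: {query}")

    # 4) Concatenate. Repeating the original query (an assumed heuristic,
    #    common in LLM query-expansion work) keeps it weighted in lexical
    #    retrievers such as BM25.
    return " ".join([query] * 5 + [corpus_expansion, knowledge_expansion])
```

The expanded string returned here would then be issued to the retriever again for the final ranking, as in standard PRF-style methods.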
Anthology ID:
2024.eacl-short.34
Volume:
Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
March
Year:
2024
Address:
St. Julian’s, Malta
Editors:
Yvette Graham, Matthew Purver
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
393–401
URL:
https://aclanthology.org/2024.eacl-short.34
Cite (ACL):
Yibin Lei, Yu Cao, Tianyi Zhou, Tao Shen, and Andrew Yates. 2024. Corpus-Steered Query Expansion with Large Language Models. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 2: Short Papers), pages 393–401, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Corpus-Steered Query Expansion with Large Language Models (Lei et al., EACL 2024)
PDF:
https://aclanthology.org/2024.eacl-short.34.pdf
Software:
 2024.eacl-short.34.software.zip