Unsupervised Text Representation Learning via Instruction-Tuning for Zero-Shot Dense Retrieval

Qiuhai Zeng, Zimeng Qiu, Dae Yon Hwang, Xin He, William M. Campbell


Abstract
Dense retrieval systems are commonly used for information retrieval (IR). They rely on text representations learned by an encoder and usually require supervised modeling via labelled data, which can be costly to obtain or simply unavailable. In this study, we introduce a novel unsupervised text representation learning technique that instruction-tunes a pre-trained encoder-decoder large language model (LLM) under the dual-encoder retrieval framework. We demonstrate across multiple languages that the corpus representation can be augmented with the representations of relevant synthetic queries generated by the instruction-tuned LLM, motivated by the Rao-Blackwell theorem. Furthermore, we effectively align the query and corpus text representations via self-instruct tuning. We evaluate the proposed method under low-resource settings on three English, two German, and one Portuguese retrieval datasets, measuring NDCG@10, MRR@100, and Recall@100. It significantly improves average zero-shot retrieval performance on all metrics, raising out-of-the-box FLAN-T5 model variations by [4.73%, 6.15%] in absolute NDCG@10 and exceeding four supervised dense retrievers.
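The augmentation idea described in the abstract can be illustrated with a minimal sketch. Here, a document's embedding is combined with the embeddings of synthetic queries generated from it, and documents are then ranked by cosine similarity against a user query, as in a standard dual-encoder setup. All function names are hypothetical, and plain NumPy vectors stand in for actual encoder outputs; this is an illustration of the general scheme, not the paper's implementation.

```python
import numpy as np

def augment_doc_embedding(doc_emb, synthetic_query_embs):
    """Augment a document embedding by averaging it with the
    embeddings of synthetic queries generated for that document
    (a Rao-Blackwell-style averaging over relevant queries)."""
    stacked = np.vstack([doc_emb] + list(synthetic_query_embs))
    avg = stacked.mean(axis=0)
    # Re-normalize so dot products act as cosine similarities.
    return avg / np.linalg.norm(avg)

def rank_documents(query_emb, doc_embs):
    """Dual-encoder retrieval: score each (unit-norm) document
    embedding by cosine similarity with the query and return
    document indices, best match first."""
    q = query_emb / np.linalg.norm(query_emb)
    scores = doc_embs @ q
    return np.argsort(-scores)
```

For instance, a document embedded as `[1, 0]` whose synthetic query is embedded as `[0, 1]` would be represented by the normalized mean of the two vectors, pulling the corpus representation toward the query space that the retriever actually sees at test time.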
Anthology ID:
2024.mrl-1.22
Volume:
Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024)
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Jonne Sälevä, Abraham Owodunni
Venue:
MRL
Publisher:
Association for Computational Linguistics
Pages:
269–279
URL:
https://aclanthology.org/2024.mrl-1.22
Cite (ACL):
Qiuhai Zeng, Zimeng Qiu, Dae Yon Hwang, Xin He, and William M. Campbell. 2024. Unsupervised Text Representation Learning via Instruction-Tuning for Zero-Shot Dense Retrieval. In Proceedings of the Fourth Workshop on Multilingual Representation Learning (MRL 2024), pages 269–279, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Unsupervised Text Representation Learning via Instruction-Tuning for Zero-Shot Dense Retrieval (Zeng et al., MRL 2024)
PDF:
https://aclanthology.org/2024.mrl-1.22.pdf