Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval

Houxing Ren, Linjun Shou, Ning Wu, Ming Gong, Daxin Jiang


Abstract
In monolingual dense retrieval, many works focus on distilling knowledge from a cross-encoder re-ranker to a dual-encoder retriever, and these methods achieve strong performance thanks to the effectiveness of the cross-encoder re-ranker. However, we find that the performance of the cross-encoder re-ranker depends heavily on the number of training samples and the quality of negative samples, both of which are hard to obtain in the cross-lingual setting. In this paper, we propose to use a query generator as the teacher in the cross-lingual setting, since it is less dependent on abundant training samples and high-quality negative samples. In addition to traditional knowledge distillation, we further propose a novel enhancement method that uses the query generator to help the dual-encoder align queries from different languages, without requiring any additional parallel sentences. Experimental results show that our method outperforms state-of-the-art methods on two benchmark datasets.
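
The core idea in the abstract is to replace the cross-encoder teacher with a query generator: the generator scores each candidate passage by the likelihood of producing the query, and those scores are distilled into the dual-encoder. Below is a minimal sketch of that distillation step, not the authors' code; the checkpoints, query-likelihood scoring, and KL objective are assumptions chosen for illustration:

# Sketch (not the paper's implementation): distill a query-generator teacher
# into a dual-encoder student for cross-lingual retrieval.
import torch
import torch.nn.functional as F
from transformers import (AutoModel, AutoTokenizer,
                          MT5ForConditionalGeneration, MT5Tokenizer)

# Hypothetical model choices: a shared multilingual encoder as the student
# and a multilingual seq2seq model as the query-generator teacher.
enc_tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
encoder = AutoModel.from_pretrained("bert-base-multilingual-cased")
gen_tok = MT5Tokenizer.from_pretrained("google/mt5-small")
generator = MT5ForConditionalGeneration.from_pretrained("google/mt5-small")

def embed(texts):
    """CLS-pooled dense embeddings from the shared dual-encoder."""
    batch = enc_tok(texts, padding=True, truncation=True, return_tensors="pt")
    return encoder(**batch).last_hidden_state[:, 0]  # [B, H]

def generator_scores(query, passages):
    """Teacher score per passage: log-likelihood of generating the query."""
    scores = []
    for p in passages:
        src = gen_tok(p, truncation=True, return_tensors="pt")
        tgt = gen_tok(query, truncation=True, return_tensors="pt").input_ids
        out = generator(**src, labels=tgt)      # out.loss = mean token NLL
        scores.append(-out.loss * tgt.size(1))  # sequence log-likelihood
    return torch.stack(scores)  # [N]

def distill_loss(query, passages):
    """KL divergence from the generator's distribution over candidates
    to the dual-encoder's dot-product relevance distribution."""
    q = embed([query])                     # [1, H]
    p = embed(passages)                    # [N, H]
    student_logits = (q @ p.T).squeeze(0)  # [N]
    with torch.no_grad():
        teacher = F.softmax(generator_scores(query, passages), dim=-1)
    return F.kl_div(F.log_softmax(student_logits, dim=-1),
                    teacher, reduction="sum")

One plausible reading of the abstract's motivation, reflected in this sketch: unlike a cross-encoder teacher, the generator scores any candidate set by query likelihood, so it does not rely on large labeled training sets or carefully mined hard negatives.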
Anthology ID:
2022.emnlp-main.203
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3107–3121
URL:
https://aclanthology.org/2022.emnlp-main.203
DOI:
10.18653/v1/2022.emnlp-main.203
Cite (ACL):
Houxing Ren, Linjun Shou, Ning Wu, Ming Gong, and Daxin Jiang. 2022. Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 3107–3121, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Empowering Dual-Encoder with Query Generator for Cross-Lingual Dense Retrieval (Ren et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.203.pdf