XTR meets ColBERTv2: Adding ColBERTv2 Optimizations to XTR

Riyaz Ahmad Bhat, Jaydeep Sen


Abstract
XTR (Lee et al., 2023) introduced an efficient multi-vector retrieval method that addresses the limitations of the ColBERT model (Khattab and Zaharia, 2020) by simplifying retrieval into a single stage through a modified learning objective. While XTR eliminates the need for multi-stage retrieval, it does not incorporate the efficiency optimizations from ColBERTv2 (Santhanam et al., 2022), which improve indexing and retrieval speed. In this work, we enhance XTR by integrating ColBERTv2's optimizations, showing that the combined approach preserves the strengths of both models. The result is a more efficient and scalable solution for multi-vector retrieval that maintains XTR's streamlined retrieval process.
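As context for the abstract, below is a minimal, illustrative Python sketch (not the authors' code) of XTR-style single-stage scoring: each query token scores a document only through the document tokens that token retrieval actually returned for it, and query tokens with no retrieved match are imputed, following the scheme described in Lee et al. (2023). The function name xtr_score and the specific imputation choice (the lowest retrieved similarity) are assumptions for illustration.

# Illustrative sketch of XTR-style single-stage scoring (assumed API, not the paper's code).
def xtr_score(retrieved_sims: dict[int, list[float]], num_query_tokens: int) -> float:
    """retrieved_sims maps a query-token index to the similarities of the
    document tokens that token retrieval returned for it (possibly empty)."""
    # Imputation value for query tokens with no retrieved document token:
    # here, the lowest similarity observed across all retrieved tokens.
    all_sims = [s for sims in retrieved_sims.values() for s in sims]
    impute = min(all_sims) if all_sims else 0.0
    total = 0.0
    for i in range(num_query_tokens):
        sims = retrieved_sims.get(i, [])
        total += max(sims) if sims else impute  # max-sim if retrieved, else imputed
    return total / num_query_tokens  # average over query tokens

# Example: 3 query tokens; token 2 retrieved nothing from this document.
print(xtr_score({0: [0.8, 0.6], 1: [0.7]}, num_query_tokens=3))  # -> 0.7

Because scoring uses only the retrieved token similarities, no second-stage gathering of full document representations is needed; ColBERTv2's contribution in the combined approach is on the indexing side (compressed token embeddings and faster retrieval), which this sketch does not depict.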
Anthology ID:
2025.coling-industry.30
Volume:
Proceedings of the 31st International Conference on Computational Linguistics: Industry Track
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert, Kareem Darwish, Apoorv Agarwal
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
358–365
URL:
https://aclanthology.org/2025.coling-industry.30/
Cite (ACL):
Riyaz Ahmad Bhat and Jaydeep Sen. 2025. XTR meets ColBERTv2: Adding ColBERTv2 Optimizations to XTR. In Proceedings of the 31st International Conference on Computational Linguistics: Industry Track, pages 358–365, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
XTR meets ColBERTv2: Adding ColBERTv2 Optimizations to XTR (Bhat & Sen, COLING 2025)
PDF:
https://aclanthology.org/2025.coling-industry.30.pdf