Building an Enhanced Autoregressive Document Retriever Leveraging Supervised Contrastive Learning

Yi-Cheng Wang, Tzu-Ting Yang, Hsin-Wei Wang, Yung-Chang Hsu, Berlin Chen


Abstract
The goal of an information retrieval system is to retrieve the documents most relevant to a given user query from a large collection, which usually requires time-consuming comparisons between the query and many candidate documents so as to find the most relevant ones. Recently, a novel retrieval modeling approach, dubbed the Differentiable Search Index (DSI), has been proposed. DSI dramatically simplifies the retrieval process by encoding all information about the document collection into the parameter space of a single Transformer model, which can then generate relevant document identifiers (IDs) in an autoregressive manner in response to a user query. Although DSI addresses several shortcomings of traditional retrieval systems, previous studies have pointed out that it may fail to retrieve relevant documents: DSI relies on document IDs as the pivotal mechanism linking queries to documents, yet not every document in the collection has corresponding relevant and irrelevant queries available for training. In view of this, we propose leveraging supervised contrastive learning to better capture the relationship between queries and documents in the latent semantic space. Furthermore, an approximate nearest neighbor search strategy is employed at retrieval time to further assist the Transformer model in generating document IDs relevant to a posed query more efficiently. A series of experiments conducted on the Natural Questions benchmark dataset confirms the effectiveness and practical feasibility of our approach in relation to several strong baseline systems.
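The abstract does not spell out the exact loss formulation, but the supervised contrastive objective it refers to is typically a SupCon-style loss over normalized query/document embeddings, where pairs sharing the same relevant document are treated as positives. The sketch below is a generic NumPy illustration under that assumption; the shapes, temperature value, and labeling scheme are hypothetical, not taken from the paper.

```python
import numpy as np

def supcon_loss(embeddings, labels, temperature=0.1):
    """Supervised contrastive loss over L2-normalized embeddings.

    `labels` marks which rows are positives for each other (e.g. a query
    and its relevant document share a label); positives are pulled together
    in the latent space while all other in-batch rows act as negatives.
    """
    # Normalize so the dot product is cosine similarity.
    z = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)
    sim = z @ z.T / temperature                       # pairwise similarities
    n = len(labels)
    not_self = ~np.eye(n, dtype=bool)                 # exclude self-pairs

    # Row-wise log-softmax over all non-self pairs (numerically stable).
    sim_max = sim.max(axis=1, keepdims=True)
    exp_sim = np.exp(sim - sim_max) * not_self
    log_prob = (sim - sim_max) - np.log(exp_sim.sum(axis=1, keepdims=True))

    # Positives: same label, excluding the anchor itself.
    pos_mask = (labels[:, None] == labels[None, :]) & not_self
    pos_counts = pos_mask.sum(axis=1)
    per_anchor = -(log_prob * pos_mask).sum(axis=1) / np.maximum(pos_counts, 1)
    return per_anchor[pos_counts > 0].mean()          # average over valid anchors
```

As a sanity check, a batch whose same-label embeddings point in similar directions should score a lower loss than one where positives point apart.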
Anthology ID:
2022.rocling-1.34
Volume:
Proceedings of the 34th Conference on Computational Linguistics and Speech Processing (ROCLING 2022)
Month:
November
Year:
2022
Address:
Taipei, Taiwan
Editors:
Yung-Chun Chang, Yi-Chin Huang
Venue:
ROCLING
Publisher:
The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages:
273–282
Language:
Chinese
URL:
https://aclanthology.org/2022.rocling-1.34
Cite (ACL):
Yi-Cheng Wang, Tzu-Ting Yang, Hsin-Wei Wang, Yung-Chang Hsu, and Berlin Chen. 2022. Building an Enhanced Autoregressive Document Retriever Leveraging Supervised Contrastive Learning. In Proceedings of the 34th Conference on Computational Linguistics and Speech Processing (ROCLING 2022), pages 273–282, Taipei, Taiwan. The Association for Computational Linguistics and Chinese Language Processing (ACLCLP).
Cite (Informal):
Building an Enhanced Autoregressive Document Retriever Leveraging Supervised Contrastive Learning (Wang et al., ROCLING 2022)
PDF:
https://aclanthology.org/2022.rocling-1.34.pdf
Data
Natural Questions