A BERT-based Siamese-structured Retrieval Model

Hung-Yun Chiang, Kuan-Yu Chen


Abstract
Due to the development of deep learning, natural language processing tasks have made great progress by leveraging the bidirectional encoder representations from Transformers (BERT). The goal of information retrieval is to search for the most relevant results for a user's query from a large set of documents. Although BERT-based retrieval models have shown excellent results in many studies, these models usually suffer from the need for large amounts of computation and/or additional storage space. In view of these flaws, a BERT-based Siamese-structured retrieval model (BESS) is proposed in this paper. BESS not only inherits the merits of pre-trained language models, but can also automatically generate extra information to complement the original query. In addition, a reinforcement learning strategy is introduced to make the model more robust. Accordingly, we evaluate BESS on three publicly available corpora, and the experimental results demonstrate the efficiency of the proposed retrieval model.
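The abstract describes a Siamese (dual-encoder) retrieval architecture built on BERT, in which queries and documents are encoded independently by a shared encoder. The following is a minimal sketch of that general idea, not the authors' exact BESS implementation: the model name, mean pooling, and dot-product scoring are illustrative assumptions, and the query-augmentation and reinforcement-learning components of BESS are omitted.

    # Minimal sketch of a Siamese (dual-encoder) BERT retriever.
    # Assumptions: Hugging Face `transformers`, mean pooling, dot-product
    # scoring -- illustrative choices, not the exact BESS configuration.
    import torch
    from transformers import AutoModel, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
    encoder = AutoModel.from_pretrained("bert-base-uncased")  # shared weights = Siamese

    def embed(texts):
        """Encode a batch of texts into fixed-size vectors via mean pooling."""
        batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
        with torch.no_grad():
            out = encoder(**batch).last_hidden_state           # (B, T, H)
        mask = batch["attention_mask"].unsqueeze(-1)           # (B, T, 1)
        return (out * mask).sum(1) / mask.sum(1)               # masked mean -> (B, H)

    # Because queries and documents are encoded independently, document
    # vectors can be pre-computed offline; only the query is encoded at
    # search time, which is what keeps dual-encoder retrieval efficient.
    docs = ["BERT is a pre-trained language model.",
            "Siamese networks share encoder weights."]
    doc_vecs = embed(docs)
    scores = embed(["what is a siamese network"]) @ doc_vecs.T  # dot-product relevance
    print(scores.argsort(descending=True))

Pre-computing the document vectors offline is the usual efficiency argument for this architecture, in contrast to cross-encoder rerankers that must run BERT over every query-document pair at query time.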
Anthology ID:
2021.rocling-1.22
Volume:
Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing (ROCLING 2021)
Month:
October
Year:
2021
Address:
Taoyuan, Taiwan
Editors:
Lung-Hao Lee, Chia-Hui Chang, Kuan-Yu Chen
Venue:
ROCLING
Publisher:
The Association for Computational Linguistics and Chinese Language Processing (ACLCLP)
Pages:
163–172
URL:
https://aclanthology.org/2021.rocling-1.22
Cite (ACL):
Hung-Yun Chiang and Kuan-Yu Chen. 2021. A BERT-based Siamese-structured Retrieval Model. In Proceedings of the 33rd Conference on Computational Linguistics and Speech Processing (ROCLING 2021), pages 163–172, Taoyuan, Taiwan. The Association for Computational Linguistics and Chinese Language Processing (ACLCLP).
Cite (Informal):
A BERT-based Siamese-structured Retrieval Model (Chiang & Chen, ROCLING 2021)
PDF:
https://aclanthology.org/2021.rocling-1.22.pdf
Data
MS MARCO, MovieQA