Task-Aware Specialization for Efficient and Robust Dense Retrieval for Open-Domain Question Answering

Hao Cheng, Hao Fang, Xiaodong Liu, Jianfeng Gao


Abstract
Given their effectiveness on knowledge-intensive natural language processing tasks, dense retrieval models have become increasingly popular. Specifically, the de facto architecture for open-domain question answering uses two isomorphic encoders that are initialized from the same pretrained model but separately parameterized for questions and passages. This bi-encoder architecture is parameter-inefficient in that there is no parameter sharing between encoders. Further, recent studies show that such dense retrievers underperform BM25 in various settings. We thus propose a new architecture, Task-Aware Specialization for dEnse Retrieval (TASER), which enables parameter sharing by interleaving shared and specialized blocks in a single encoder. Our experiments on five question answering datasets show that TASER can achieve superior accuracy, surpassing BM25, while using only about 60% of the parameters of bi-encoder dense retrievers. In out-of-domain evaluations, TASER is also empirically more robust than bi-encoder dense retrievers. Our code is available at https://github.com/microsoft/taser.
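
The abstract describes the architecture only at a high level, so the sketch below is purely illustrative: it assumes the specialized blocks are selected by input type (question vs. passage) while the remaining blocks are shared, and uses standard PyTorch transformer layers. The actual block design, routing, and hyperparameters in TASER may differ; see the paper and the linked repository for the authors' implementation.

# Minimal sketch of a single encoder that interleaves shared and
# task-specialized transformer blocks (illustrative assumption, not the
# paper's exact implementation).
import torch
import torch.nn as nn

class TaskAwareEncoder(nn.Module):
    def __init__(self, dim=768, heads=12, num_layers=12, specialize_every=2):
        super().__init__()
        self.layers = nn.ModuleList()
        self.is_specialized = []
        for i in range(num_layers):
            if (i + 1) % specialize_every == 0:
                # Specialized block: separate parameters per input type.
                self.layers.append(nn.ModuleDict({
                    "question": nn.TransformerEncoderLayer(dim, heads, batch_first=True),
                    "passage": nn.TransformerEncoderLayer(dim, heads, batch_first=True),
                }))
                self.is_specialized.append(True)
            else:
                # Shared block: the same parameters process both questions and passages.
                self.layers.append(nn.TransformerEncoderLayer(dim, heads, batch_first=True))
                self.is_specialized.append(False)

    def forward(self, hidden, input_type):
        # hidden: (batch, seq_len, dim); input_type: "question" or "passage"
        for layer, specialized in zip(self.layers, self.is_specialized):
            hidden = layer[input_type](hidden) if specialized else layer(hidden)
        return hidden[:, 0]  # pooled representation from the first token

# Usage: the same encoder instance serves both sides of retrieval, e.g.
#   q_vec = encoder(question_hidden_states, input_type="question")
#   p_vec = encoder(passage_hidden_states, input_type="passage")
# so only the specialized blocks are duplicated rather than the full encoder,
# which is where the parameter savings over a bi-encoder come from.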
Anthology ID: 2023.acl-short.159
Volume: Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: ACL
Publisher: Association for Computational Linguistics
Pages: 1864–1875
URL: https://aclanthology.org/2023.acl-short.159
DOI: 10.18653/v1/2023.acl-short.159
Cite (ACL): Hao Cheng, Hao Fang, Xiaodong Liu, and Jianfeng Gao. 2023. Task-Aware Specialization for Efficient and Robust Dense Retrieval for Open-Domain Question Answering. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1864–1875, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): Task-Aware Specialization for Efficient and Robust Dense Retrieval for Open-Domain Question Answering (Cheng et al., ACL 2023)
PDF: https://aclanthology.org/2023.acl-short.159.pdf
Video: https://aclanthology.org/2023.acl-short.159.mp4