Once is Enough: A Light-Weight Cross-Attention for Fast Sentence Pair Modeling

Yuanhang Yang, Shiyi Qi, Chuanyi Liu, Qifan Wang, Cuiyun Gao, Zenglin Xu


Abstract
Transformer-based models have achieved great success on sentence pair modeling tasks, such as answer selection and natural language inference (NLI). These models generally perform cross-attention over input pairs, leading to prohibitive computational cost. Recent studies propose dual-encoder and late interaction architectures for faster computation. However, the balance between the expressiveness of cross-attention and computation speedup still needs to be better coordinated. To this end, this paper introduces a novel paradigm TopicAns for efficient sentence pair modeling. TopicAns involves a lightweight cross-attention mechanism. It encodes the query only once while modeling the query-candidate interaction in parallel. Extensive experiments conducted on four tasks demonstrate that our TopicAns can speed up sentence pair modeling by over 113x while achieving performance comparable to the more expensive cross-attention models.
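The abstract's core idea, encoding the query once and letting all candidates cross-attend to that cached encoding in a single parallel pass, can be sketched as follows. This is a minimal illustration in PyTorch, not the authors' implementation; the module name, tensor shapes, and mean pooling are assumptions made for exposition only.

```python
# Minimal sketch (assumed, not the paper's code): the query is encoded exactly
# once; its cached token representations are broadcast so that many candidates
# cross-attend to it in one batched (parallel) forward pass.
import torch
import torch.nn as nn

class LightweightCrossAttention(nn.Module):
    """Hypothetical single cross-attention layer over a cached query encoding."""
    def __init__(self, dim: int, num_heads: int = 8):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, cand: torch.Tensor, query_cache: torch.Tensor) -> torch.Tensor:
        # cand:        (num_candidates, cand_len, dim)  pre-encoded candidates
        # query_cache: (1, query_len, dim)              query encoded only once
        kv = query_cache.expand(cand.size(0), -1, -1)  # broadcast cache; no re-encoding
        out, _ = self.attn(cand, kv, kv)               # candidates attend to the query
        return out.mean(dim=1)                         # pooled pair representation

# Usage: one query, many candidates, scored in a single parallel pass.
dim = 768
query_enc = torch.randn(1, 32, dim)     # query tokens, encoded exactly once
cand_enc = torch.randn(100, 64, dim)    # 100 candidates, encoded offline
pair_repr = LightweightCrossAttention(dim)(cand_enc, query_enc)
print(pair_repr.shape)  # torch.Size([100, 768])
```

The speedup comes from amortization: the expensive encoder runs once per query (and offline per candidate), while only this thin attention layer runs per query-candidate pair.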
Anthology ID:
2023.emnlp-main.168
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2800–2806
URL:
https://aclanthology.org/2023.emnlp-main.168
DOI:
10.18653/v1/2023.emnlp-main.168
Cite (ACL):
Yuanhang Yang, Shiyi Qi, Chuanyi Liu, Qifan Wang, Cuiyun Gao, and Zenglin Xu. 2023. Once is Enough: A Light-Weight Cross-Attention for Fast Sentence Pair Modeling. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 2800–2806, Singapore. Association for Computational Linguistics.
Cite (Informal):
Once is Enough: A Light-Weight Cross-Attention for Fast Sentence Pair Modeling (Yang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.168.pdf
Video:
https://aclanthology.org/2023.emnlp-main.168.mp4