Modularized Transfomer-based Ranking Framework

Luyu Gao, Zhuyun Dai, Jamie Callan


Abstract
Recent innovations in Transformer-based ranking models have advanced the state-of-the-art in information retrieval. However, these Transformers are computationally expensive, and their opaque hidden states make it hard to understand the ranking process. In this work, we modularize the Transformer ranker into separate modules for text representation and interaction. We show how this design enables substantially faster ranking using offline pre-computed representations and light-weight online interactions. The modular design is also easier to interpret and sheds light on the ranking process in Transformer rankers.
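The representation/interaction split described in the abstract can be sketched in a few lines. The following is a hypothetical PyTorch illustration of the general idea, not the paper's actual implementation: the class names, layer counts, scoring head, and the reuse of a single representation module for both queries and documents are all assumptions made for brevity.

```python
import torch
import torch.nn as nn


class RepresentationModule(nn.Module):
    """Encodes a text independently of any other text, so document
    representations can be pre-computed offline. (Hypothetical sketch.)"""
    def __init__(self, hidden=768, heads=12, layers=6):
        super().__init__()
        layer = nn.TransformerEncoderLayer(hidden, heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=layers)

    def forward(self, embeddings):           # (batch, seq_len, hidden)
        return self.encoder(embeddings)      # contextualized token states


class InteractionModule(nn.Module):
    """Light-weight online step: cross-attention from query states to
    cached document states, followed by a relevance score. (Hypothetical.)"""
    def __init__(self, hidden=768, heads=12):
        super().__init__()
        self.cross_attn = nn.MultiheadAttention(hidden, heads, batch_first=True)
        self.scorer = nn.Linear(hidden, 1)

    def forward(self, query_states, doc_states):
        # Query tokens attend over the pre-computed document representation.
        ctx, _ = self.cross_attn(query_states, doc_states, doc_states)
        return self.scorer(ctx[:, 0])        # score from first query position


# Offline: encode every document once and cache the result.
# Online:  encode the short query, then run only the cheap interaction
#          against each candidate's cached states.
rep, inter = RepresentationModule(), InteractionModule()
doc_states = rep(torch.randn(1, 128, 768))   # cached ahead of time
query_states = rep(torch.randn(1, 16, 768))  # computed at query time
score = inter(query_states, doc_states)      # (1, 1) relevance score
```

In this sketch, the expensive Transformer encoding of each document runs once offline; per query, only the short query encoding and the small cross-attention step execute, which is where the speed-up claimed in the abstract would come from.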
Anthology ID:
2020.emnlp-main.342
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4180–4190
URL:
https://aclanthology.org/2020.emnlp-main.342
DOI:
10.18653/v1/2020.emnlp-main.342
Cite (ACL):
Luyu Gao, Zhuyun Dai, and Jamie Callan. 2020. Modularized Transfomer-based Ranking Framework. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 4180–4190, Online. Association for Computational Linguistics.
Cite (Informal):
Modularized Transfomer-based Ranking Framework (Gao et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.342.pdf
Video:
https://slideslive.com/38939359
Data
MS MARCO