MIA 2022 Shared Task Submission: Leveraging Entity Representations, Dense-Sparse Hybrids, and Fusion-in-Decoder for Cross-Lingual Question Answering

Zhucheng Tu, Sarguna Janani Padmanabhan


Abstract
We describe our two-stage system for the Multilingual Information Access (MIA) 2022 Shared Task on Cross-Lingual Open-Retrieval Question Answering. The first stage performs multilingual passage retrieval with a hybrid dense and sparse retrieval strategy. The second stage consists of a reader that outputs the answer from the top passages returned by the first stage. We show the efficacy of using entity representations, sparse retrieval signals to complement dense retrieval, and Fusion-in-Decoder. On the development set, we obtain 43.46 F1 on XOR-TyDi QA and 21.99 F1 on MKQA, for an average F1 score of 32.73. On the test set, we obtain 40.93 F1 on XOR-TyDi QA and 22.29 F1 on MKQA, for an average F1 score of 31.61. We improve over the official baseline by over 4 F1 points on both the development and test sets.
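The hybrid dense-sparse retrieval strategy described in the abstract can be illustrated as a score interpolation between a dense (embedding-based) score and a sparse (lexical) score per passage. This is a minimal sketch under assumptions: the linear interpolation form, the weight `alpha`, and the helper names are illustrative, not the authors' exact method.

```python
def hybrid_score(dense_score: float, sparse_score: float, alpha: float = 0.5) -> float:
    """Linearly interpolate dense and sparse retrieval scores for one passage.

    `alpha` weights the dense score; (1 - alpha) weights the sparse score.
    """
    return alpha * dense_score + (1.0 - alpha) * sparse_score


def rank_passages(dense: dict, sparse: dict, alpha: float = 0.5) -> list:
    """Rank passage ids by hybrid score.

    `dense` and `sparse` map passage id -> retrieval score; a passage
    missing from one retriever gets a score of 0.0 from that retriever.
    """
    ids = set(dense) | set(sparse)
    scored = {
        pid: hybrid_score(dense.get(pid, 0.0), sparse.get(pid, 0.0), alpha)
        for pid in ids
    }
    return sorted(scored, key=scored.get, reverse=True)


# Example: "p2" is mediocre for the dense retriever but strong lexically,
# so it overtakes "p1" on the combined score.
dense = {"p1": 0.9, "p2": 0.7}
sparse = {"p1": 0.1, "p2": 0.8}
print(rank_passages(dense, sparse, alpha=0.5))  # ['p2', 'p1']
```

The top-ranked passages from such a hybrid list would then be passed to the second-stage reader.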
Anthology ID:
2022.mia-1.10
Volume:
Proceedings of the Workshop on Multilingual Information Access (MIA)
Month:
July
Year:
2022
Address:
Seattle, USA
Editors:
Akari Asai, Eunsol Choi, Jonathan H. Clark, Junjie Hu, Chia-Hsuan Lee, Jungo Kasai, Shayne Longpre, Ikuya Yamada, Rui Zhang
Venue:
MIA
Publisher:
Association for Computational Linguistics
Pages:
100–107
URL:
https://aclanthology.org/2022.mia-1.10
DOI:
10.18653/v1/2022.mia-1.10
Cite (ACL):
Zhucheng Tu and Sarguna Janani Padmanabhan. 2022. MIA 2022 Shared Task Submission: Leveraging Entity Representations, Dense-Sparse Hybrids, and Fusion-in-Decoder for Cross-Lingual Question Answering. In Proceedings of the Workshop on Multilingual Information Access (MIA), pages 100–107, Seattle, USA. Association for Computational Linguistics.
Cite (Informal):
MIA 2022 Shared Task Submission: Leveraging Entity Representations, Dense-Sparse Hybrids, and Fusion-in-Decoder for Cross-Lingual Question Answering (Tu & Padmanabhan, MIA 2022)
PDF:
https://aclanthology.org/2022.mia-1.10.pdf
Data
MKQA, Mr. TYDI, Natural Questions, TyDiQA