mPMR: A Multilingual Pre-trained Machine Reader at Scale

Weiwen Xu, Xin Li, Wai Lam, Lidong Bing


Abstract
We present the multilingual Pre-trained Machine Reader (mPMR), a novel method for multilingual machine reading comprehension (MRC)-style pre-training. mPMR guides multilingual pre-trained language models (mPLMs) to perform natural language understanding (NLU), covering both sequence classification and span extraction, in multiple languages. When only source-language fine-tuning data is available, existing mPLMs achieve cross-lingual generalization solely by transferring NLU capability from the source language to target languages. In contrast, mPMR lets downstream tasks directly inherit multilingual NLU capability from the MRC-style pre-training, and therefore acquires stronger NLU capability in target languages. mPMR also provides a unified solver for cross-lingual span extraction and sequence classification, enabling the extraction of rationales that explain the sentence-pair classification process.
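To make the "unified solver" idea in the abstract concrete, below is a minimal, hypothetical PyTorch sketch of a shared extractive head: both span extraction and sequence classification are handled through one span-scoring interface, where a designated position (here, position 0) can stand in for a classification decision. The class name MRCStyleHead, the single linear scorer, and the random encoder outputs are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class MRCStyleHead(nn.Module):
    """Hypothetical sketch of an MRC-style unified solver (not the paper's code).

    Both task types share one extractive interface:
      - span extraction: extract the answer span from the context;
      - sequence classification: treat the label description as the query and
        extract position 0 (label holds) or a rationale span from the input.
    """

    def __init__(self, hidden_size: int):
        super().__init__()
        # Assumption: one linear layer scoring each token as a span start/end.
        self.span_scorer = nn.Linear(hidden_size, 2)

    def forward(self, hidden_states: torch.Tensor):
        # hidden_states: (batch, seq_len, hidden) from any multilingual encoder.
        logits = self.span_scorer(hidden_states)          # (batch, seq_len, 2)
        start_logits, end_logits = logits.unbind(dim=-1)  # each (batch, seq_len)
        return start_logits, end_logits


# Toy usage: random tensors stand in for outputs of an mPLM such as XLM-R.
encoder_out = torch.randn(1, 16, 768)
head = MRCStyleHead(hidden_size=768)
start, end = head(encoder_out)
# Highest-scoring start/end positions; an extracted non-trivial span can serve
# as a rationale for a sentence-pair classification decision.
print(start.argmax(dim=-1).item(), end.argmax(dim=-1).item())
```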
Anthology ID:
2023.acl-short.131
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
1533–1546
URL:
https://aclanthology.org/2023.acl-short.131
DOI:
10.18653/v1/2023.acl-short.131
Cite (ACL):
Weiwen Xu, Xin Li, Wai Lam, and Lidong Bing. 2023. mPMR: A Multilingual Pre-trained Machine Reader at Scale. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 1533–1546, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
mPMR: A Multilingual Pre-trained Machine Reader at Scale (Xu et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-short.131.pdf
Video:
https://aclanthology.org/2023.acl-short.131.mp4