To Answer or Not To Answer? Improving Machine Reading Comprehension Model with Span-based Contrastive Learning

Yunjie Ji, Liangyu Chen, Chenxiao Dou, Baochang Ma, Xiangang Li


Abstract
Machine Reading Comprehension with Unanswerable Questions is a difficult NLP task, challenged by questions that cannot be answered from the given passage. It is observed that subtle literal changes often turn an answerable question into an unanswerable one, yet most MRC models fail to recognize such changes. To address this problem, we propose a span-based contrastive learning method (spanCL) that explicitly contrasts answerable questions with their answerable and unanswerable counterparts at the answer-span level. With spanCL, MRC models are forced to perceive crucial semantic changes arising from slight literal differences. Experiments on the SQuAD 2.0 dataset show that spanCL improves baselines significantly, yielding 0.86–2.14 points of absolute EM improvement. Additional experiments show that spanCL is an effective way to utilize automatically generated questions.
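The span-level contrast the abstract describes can be pictured as an InfoNCE-style objective over answer-span representations: an anchor span from an answerable question is pulled toward the span of its answerable counterpart and pushed away from spans proposed for unanswerable counterparts. The sketch below is illustrative only, not the paper's exact formulation; it assumes mean-pooled token states as span representations and a temperature-scaled softmax loss, and names such as span_repr and span_contrastive_loss are hypothetical.

import torch
import torch.nn.functional as F

def span_repr(hidden_states, start, end):
    # Mean-pool the encoder's token states over the answer span [start, end]
    # (one simple choice of span representation; assumed, not from the paper).
    return hidden_states[start:end + 1].mean(dim=0)

def span_contrastive_loss(anchor, positive, negatives, temperature=0.1):
    # anchor:    (d,)   answer span of an answerable question
    # positive:  (d,)   answer span of its answerable counterpart
    # negatives: (k, d) candidate spans from unanswerable counterparts
    anchor = F.normalize(anchor, dim=-1)
    positive = F.normalize(positive, dim=-1)
    negatives = F.normalize(negatives, dim=-1)

    pos_sim = (anchor * positive).sum(-1, keepdim=True) / temperature  # (1,)
    neg_sim = negatives @ anchor / temperature                         # (k,)
    logits = torch.cat([pos_sim, neg_sim], dim=0).unsqueeze(0)         # (1, k+1)

    # The positive sits at index 0, so the target class is 0.
    target = torch.zeros(1, dtype=torch.long)
    return F.cross_entropy(logits, target)

# Toy usage with random tensors in place of real encoder outputs:
d, k = 768, 4
anchor = span_repr(torch.randn(20, d), 5, 8)
positive = span_repr(torch.randn(20, d), 5, 8)
negatives = torch.randn(k, d)
loss = span_contrastive_loss(anchor, positive, negatives)

In practice this auxiliary loss would be added to the standard span-extraction objective, so the model learns that small literal edits can flip a question from answerable to unanswerable.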
Anthology ID:
2022.findings-naacl.96
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1292–1300
URL:
https://aclanthology.org/2022.findings-naacl.96
DOI:
10.18653/v1/2022.findings-naacl.96
Bibkey:
Cite (ACL):
Yunjie Ji, Liangyu Chen, Chenxiao Dou, Baochang Ma, and Xiangang Li. 2022. To Answer or Not To Answer? Improving Machine Reading Comprehension Model with Span-based Contrastive Learning. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 1292–1300, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
To Answer or Not To Answer? Improving Machine Reading Comprehension Model with Span-based Contrastive Learning (Ji et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.96.pdf
Video:
https://aclanthology.org/2022.findings-naacl.96.mp4
Data:
SQuAD