ForceReader: a BERT-based Interactive Machine Reading Comprehension Model with Attention Separation

Zheng Chen, Kangjian Wu


Abstract
The release of BERT revolutionized the development of NLP. Various BERT-based reading comprehension models have since been proposed, repeatedly raising the state of the art on reading comprehension benchmarks. However, these BERT-based models inherit BERT's combined input method, representing the question and paragraph as a single packed sequence, without further modification for reading comprehension. This paper analyzes this input method in depth and identifies a problem with it, which we call attention deconcentration. Accordingly, this paper proposes ForceReader, a BERT-based interactive machine reading comprehension model. First, ForceReader introduces a novel solution, Attention Separation Representation, to address attention deconcentration. Moreover, motivated by the logical nature of reading comprehension tasks, ForceReader adopts Multi-mode Reading and Interactive Reasoning strategies. For attention calculation, ForceReader employs Conditional Background Attention to compensate for the loss of overall contextual semantics after attention separation. As an integral model, ForceReader shows a significant improvement on reading comprehension tasks compared to BERT. In addition, this paper presents detailed visual analyses of attention and proposes strategies accordingly, which may serve as a further argument in the discussion of attention's explanatory power.
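For concreteness, the combined input method the abstract refers to is the standard packed encoding exposed by, for example, the HuggingFace transformers API; the short Python sketch below reproduces it. The segment-level separation mask at the end is only an illustrative assumption of what restricting attention to within-segment tokens could look like, not the paper's published Attention Separation Representation.

# A minimal sketch of BERT's combined input method: question and paragraph
# are packed into one sequence, so self-attention mixes both segments freely.
# The separation mask below is an illustrative assumption, not ForceReader's
# exact formulation.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

question = "Who proposed ForceReader?"
paragraph = "ForceReader is a BERT-based interactive machine reading comprehension model."

# Standard packed encoding: [CLS] question [SEP] paragraph [SEP]
inputs = tokenizer(question, paragraph, return_tensors="pt")
outputs = model(**inputs)  # every token can attend to every other token

# Hypothetical segment-separation mask: allow attention only within the
# question or within the paragraph, blocking cross-segment mixing.
seg = inputs["token_type_ids"][0]                  # 0 = question side, 1 = paragraph side
allowed = seg.unsqueeze(0) == seg.unsqueeze(1)     # (seq_len, seq_len) boolean
separation_mask = allowed.float()                  # 1 where attention is permitted

Under the packed encoding, a question token's attention mass can spread across the entire concatenated sequence; a mask like separation_mask is one simple way to force each segment to be encoded against its own context first, which is the intuition behind attention separation as the abstract describes it.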
Anthology ID:
2020.coling-main.241
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2676–2686
URL:
https://aclanthology.org/2020.coling-main.241
DOI:
10.18653/v1/2020.coling-main.241
Cite (ACL):
Zheng Chen and Kangjian Wu. 2020. ForceReader: a BERT-based Interactive Machine Reading Comprehension Model with Attention Separation. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2676–2686, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
ForceReader: a BERT-based Interactive Machine Reading Comprehension Model with Attention Separation (Chen & Wu, COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.241.pdf
Data
SQuAD