Does Structure Matter? Encoding Documents for Machine Reading Comprehension

Hui Wan, Song Feng, Chulaka Gunasekara, Siva Sankalp Patel, Sachindra Joshi, Luis Lastras
Abstract
Machine reading comprehension is a challenging task, especially when querying documents with deep and interconnected contexts. Transformer-based methods have achieved strong performance on this task; however, most of them still treat documents as a flat sequence of tokens. This work proposes a new Transformer-based method that reads a document as tree slices. It contains two modules, for identifying the most relevant text passage and the best answer span respectively, which are not only jointly trained but also jointly consulted at inference time. Our evaluation results show that the proposed method outperforms several competitive baseline approaches on two datasets from varied domains.
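The abstract describes two modules whose scores are combined ("jointly consulted") at inference time. The following is a minimal sketch of that general idea, not the authors' implementation: it assumes a passage-selection module has produced a relevance logit per tree slice and a span module has produced span logits within each slice, and it picks the answer maximizing the joint probability. All names and data structures are hypothetical.

```python
# Illustrative sketch only -- not the method from the paper.
# It shows one common way to jointly consult a passage-relevance score and a
# span score at inference time: score = P(slice relevant) * P(span | slice).
import math
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class Slice:
    text: str
    relevance_logit: float                       # from the passage-selection module (assumed)
    span_logits: List[Tuple[int, int, float]]    # (start, end, logit) from the span module (assumed)


def softmax(xs: List[float]) -> List[float]:
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]


def best_answer(slices: List[Slice]) -> Tuple[int, Tuple[int, int]]:
    """Return (slice index, (start, end)) maximizing the joint score."""
    rel_probs = softmax([s.relevance_logit for s in slices])
    best_score, best_slice, best_span = -1.0, 0, (0, 0)
    for i, (sl, p_rel) in enumerate(zip(slices, rel_probs)):
        span_probs = softmax([logit for _, _, logit in sl.span_logits])
        for (start, end, _), p_span in zip(sl.span_logits, span_probs):
            score = p_rel * p_span
            if score > best_score:
                best_score, best_slice, best_span = score, i, (start, end)
    return best_slice, best_span


# Example usage with toy scores:
slices = [
    Slice("intro section", relevance_logit=0.2, span_logits=[(0, 3, 1.0), (4, 7, 0.5)]),
    Slice("policy details", relevance_logit=2.1, span_logits=[(10, 14, 2.3), (2, 5, 0.1)]),
]
print(best_answer(slices))  # -> (1, (10, 14))
```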
Anthology ID:
2021.naacl-main.367
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
4626–4634
URL:
https://aclanthology.org/2021.naacl-main.367
DOI:
10.18653/v1/2021.naacl-main.367
Cite (ACL):
Hui Wan, Song Feng, Chulaka Gunasekara, Siva Sankalp Patel, Sachindra Joshi, and Luis Lastras. 2021. Does Structure Matter? Encoding Documents for Machine Reading Comprehension. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 4626–4634, Online. Association for Computational Linguistics.
Cite (Informal):
Does Structure Matter? Encoding Documents for Machine Reading Comprehension (Wan et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.367.pdf
Video:
https://aclanthology.org/2021.naacl-main.367.mp4
Data:
Doc2Dial, SQuAD