Narrative Question Answering with Cutting-Edge Open-Domain QA Techniques: A Comprehensive Study

Xiangyang Mou, Chenghao Yang, Mo Yu, Bingsheng Yao, Xiaoxiao Guo, Saloni Potdar, Hui Su


Abstract
Recent advancements in open-domain question answering (ODQA), that is, finding answers from large open-domain corpora such as Wikipedia, have led to human-level performance on many datasets. However, progress in QA over book stories (Book QA) lags despite its similar task formulation to ODQA. This work provides a comprehensive and quantitative analysis of the difficulty of Book QA: (1) We benchmark research on the NarrativeQA dataset with extensive experiments using cutting-edge ODQA techniques. This quantifies the challenges that Book QA poses and advances the published state of the art with a ∼7% absolute improvement in ROUGE-L. (2) We further analyze the detailed challenges of Book QA through human studies. Our findings indicate that event-centric questions dominate this task, which highlights the inability of existing QA models to handle event-oriented scenarios.
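
The abstract reports a ∼7% absolute gain in ROUGE-L on NarrativeQA. As a rough illustration of how such free-form answers are typically scored, here is a minimal sketch using the open-source rouge-score package; this is not the authors' evaluation code, and the example question answers below are invented.

```python
# Minimal sketch: scoring a generated Book QA answer with ROUGE-L,
# using the open-source `rouge-score` package (an assumption, not
# necessarily the authors' exact evaluation script).
from rouge_score import rouge_scorer

scorer = rouge_scorer.RougeScorer(["rougeL"], use_stemmer=True)

# NarrativeQA provides two human-written reference answers per question;
# a common convention is to take the maximum score over the references.
references = ["He hides the letter in the fireplace.",
              "He burns the letter in the fireplace."]
prediction = "He hides the letter inside the fireplace."

best_f1 = max(scorer.score(ref, prediction)["rougeL"].fmeasure
              for ref in references)
print(f"ROUGE-L F1: {best_f1:.3f}")
```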
Anthology ID:
2021.tacl-1.61
Volume:
Transactions of the Association for Computational Linguistics, Volume 9
Year:
2021
Address:
Cambridge, MA
Editors:
Brian Roark, Ani Nenkova
Venue:
TACL
Publisher:
MIT Press
Pages:
1032–1046
URL:
https://aclanthology.org/2021.tacl-1.61
DOI:
10.1162/tacl_a_00411
Cite (ACL):
Xiangyang Mou, Chenghao Yang, Mo Yu, Bingsheng Yao, Xiaoxiao Guo, Saloni Potdar, and Hui Su. 2021. Narrative Question Answering with Cutting-Edge Open-Domain QA Techniques: A Comprehensive Study. Transactions of the Association for Computational Linguistics, 9:1032–1046.
Cite (Informal):
Narrative Question Answering with Cutting-Edge Open-Domain QA Techniques: A Comprehensive Study (Mou et al., TACL 2021)
PDF:
https://aclanthology.org/2021.tacl-1.61.pdf
Video:
https://aclanthology.org/2021.tacl-1.61.mp4