Frustratingly Hard Evidence Retrieval for QA Over Books

Xiangyang Mou, Mo Yu, Bingsheng Yao, Chenghao Yang, Xiaoxiao Guo, Saloni Potdar, Hui Su


Abstract
Considerable progress has been made on question answering (QA) in recent years, but the particular problem of QA over narrative book stories has not been explored in depth. We formulate BookQA as an open-domain QA task, given its similar dependence on evidence retrieval, and further investigate how state-of-the-art open-domain QA approaches can help BookQA. Besides achieving state-of-the-art results on the NarrativeQA benchmark, our study also reveals, through extensive experiments and analysis, the difficulty of evidence retrieval in books, which calls for future work on novel retrieval solutions for BookQA.
Anthology ID:
2020.nuse-1.13
Volume:
Proceedings of the First Joint Workshop on Narrative Understanding, Storylines, and Events
Month:
July
Year:
2020
Address:
Online
Editors:
Claire Bonial, Tommaso Caselli, Snigdha Chaturvedi, Elizabeth Clark, Ruihong Huang, Mohit Iyyer, Alejandro Jaimes, Heng Ji, Lara J. Martin, Ben Miller, Teruko Mitamura, Nanyun Peng, Joel Tetreault
Venues:
NUSE | WNU
Publisher:
Association for Computational Linguistics
Pages:
108–113
URL:
https://aclanthology.org/2020.nuse-1.13
DOI:
10.18653/v1/2020.nuse-1.13
Cite (ACL):
Xiangyang Mou, Mo Yu, Bingsheng Yao, Chenghao Yang, Xiaoxiao Guo, Saloni Potdar, and Hui Su. 2020. Frustratingly Hard Evidence Retrieval for QA Over Books. In Proceedings of the First Joint Workshop on Narrative Understanding, Storylines, and Events, pages 108–113, Online. Association for Computational Linguistics.
Cite (Informal):
Frustratingly Hard Evidence Retrieval for QA Over Books (Mou et al., NUSE-WNU 2020)
PDF:
https://aclanthology.org/2020.nuse-1.13.pdf
Video:
http://slideslive.com/38929752
Data
NarrativeQA