Extending the Scope of Out-of-Domain: Examining QA models in multiple subdomains

Chenyang Lyu, Jennifer Foster, Yvette Graham


Abstract
Past work investigating the out-of-domain performance of QA systems has mainly focused on general domains (e.g. the news domain, the Wikipedia domain), underestimating the importance of subdomains defined by the internal characteristics of QA datasets. In this paper, we extend the scope of “out-of-domain” by splitting QA examples into different subdomains according to their internal characteristics, including question type, text length, and answer position. We then examine the performance of QA systems trained on data from different subdomains. Experimental results show that the performance of QA systems can be significantly reduced when the training data and test data come from different subdomains. These results call into question the generalizability of current QA systems across multiple subdomains, suggesting the need to combat the bias introduced by the internal characteristics of QA datasets.
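The abstract describes partitioning QA examples into subdomains by internal characteristics (question type, text length, answer position). The following is a hypothetical sketch of such a partitioning for SQuAD-style examples; it is not the authors' code, and the wh-word list, length threshold, and position buckets are illustrative assumptions:

```python
def subdomains(example):
    """Assign a SQuAD-style example to subdomains by question type,
    context length, and answer position. Thresholds are illustrative."""
    question = example["question"].lower()
    context = example["context"]
    answer_start = example["answer_start"]

    # Question type: leading wh-word, else "other"
    qtype = next((w for w in ("what", "who", "when", "where", "why", "how", "which")
                  if question.startswith(w)), "other")

    # Text length bucket by whitespace tokens (threshold is an assumption)
    length = "long" if len(context.split()) > 150 else "short"

    # Answer position: which third of the context the answer starts in
    frac = answer_start / max(len(context), 1)
    position = "begin" if frac < 1 / 3 else ("middle" if frac < 2 / 3 else "end")

    return {"question_type": qtype, "text_length": length, "answer_position": position}


example = {
    "question": "When was the university founded?",
    "context": "The university was founded in 1592 by royal charter. " * 3,
    "answer_start": 30,
}
print(subdomains(example))
# → {'question_type': 'when', 'text_length': 'short', 'answer_position': 'begin'}
```

Training on one such subdomain and evaluating on another (e.g. train on "what" questions, test on "why" questions) is the kind of cross-subdomain setup the paper examines.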
Anthology ID:
2022.insights-1.4
Volume:
Proceedings of the Third Workshop on Insights from Negative Results in NLP
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Shabnam Tafreshi, João Sedoc, Anna Rogers, Aleksandr Drozd, Anna Rumshisky, Arjun Akula
Venue:
insights
Publisher:
Association for Computational Linguistics
Pages:
24–37
URL:
https://aclanthology.org/2022.insights-1.4
DOI:
10.18653/v1/2022.insights-1.4
Cite (ACL):
Chenyang Lyu, Jennifer Foster, and Yvette Graham. 2022. Extending the Scope of Out-of-Domain: Examining QA models in multiple subdomains. In Proceedings of the Third Workshop on Insights from Negative Results in NLP, pages 24–37, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Extending the Scope of Out-of-Domain: Examining QA models in multiple subdomains (Lyu et al., insights 2022)
PDF:
https://aclanthology.org/2022.insights-1.4.pdf
Video:
 https://aclanthology.org/2022.insights-1.4.mp4
Code
 lyuchenyang/analysing-question-answering-data
Data
NewsQA, SQuAD