Learning with Instance Bundles for Reading Comprehension

Dheeru Dua, Pradeep Dasigi, Sameer Singh, Matt Gardner


Abstract
When training most modern reading comprehension models, all the questions associated with a context are treated as being independent from each other. However, closely related questions and their corresponding answers are not independent, and leveraging these relationships could provide a strong supervision signal to a model. Drawing on ideas from contrastive estimation, we introduce several new supervision losses that compare question-answer scores across multiple related instances. Specifically, we normalize these scores across various neighborhoods of closely contrasting questions and/or answers, adding a cross entropy loss term in addition to traditional maximum likelihood estimation. Our techniques require bundles of related question-answer pairs, which we either mine from within existing data or create using automated heuristics. We empirically demonstrate the effectiveness of training with instance bundles on two datasets—HotpotQA and ROPES—showing up to 9% absolute gains in accuracy.
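The core idea in the abstract — normalizing question-answer scores across a bundle of closely related instances and adding a cross-entropy term — can be illustrated with a minimal sketch. This is not the paper's exact loss; the function name, the diagonal gold-answer convention, and the simple score-matrix layout are assumptions made for illustration.

```python
import math

def log_softmax(scores):
    # Numerically stable log-softmax over a list of raw scores.
    m = max(scores)
    log_z = m + math.log(sum(math.exp(s - m) for s in scores))
    return [s - log_z for s in scores]

def bundle_contrastive_loss(score_matrix):
    """Contrastive cross-entropy over an instance bundle (illustrative).

    score_matrix[i][j] is the model's score for candidate answer j given
    question i; by convention here, answer i is the gold answer for
    question i. Each question's score is normalized over all candidate
    answers in the bundle, and the mean negative log-likelihood of the
    gold answers is returned. In practice this term would be added to the
    usual per-instance maximum likelihood loss.
    """
    losses = []
    for i, row in enumerate(score_matrix):
        losses.append(-log_softmax(row)[i])
    return sum(losses) / len(losses)
```

With uniform scores the loss is log(n) for a bundle of n candidates, and it approaches zero as the gold answers dominate their distractors, which is the supervision signal the bundle provides.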
Anthology ID:
2021.emnlp-main.584
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
7347–7357
URL:
https://aclanthology.org/2021.emnlp-main.584
DOI:
10.18653/v1/2021.emnlp-main.584
Cite (ACL):
Dheeru Dua, Pradeep Dasigi, Sameer Singh, and Matt Gardner. 2021. Learning with Instance Bundles for Reading Comprehension. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 7347–7357, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Learning with Instance Bundles for Reading Comprehension (Dua et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.584.pdf
Video:
https://aclanthology.org/2021.emnlp-main.584.mp4
Data
HotpotQA, Quoref, ROPES