Self-Supervised Test-Time Learning for Reading Comprehension

Pratyay Banerjee, Tejas Gokhale, Chitta Baral


Abstract
Recent work on unsupervised question answering has shown that models can be trained with procedurally generated question-answer pairs and can achieve performance competitive with supervised methods. In this work, we consider the task of unsupervised reading comprehension and present a method that performs “test-time learning” (TTL) on a given context (text passage), without requiring training on large-scale human-authored datasets containing context-question-answer triplets. This method operates directly on a single test context, uses self-supervision to train models on synthetically generated question-answer pairs, and then infers answers to unseen human-authored questions for this context. Our method achieves accuracies competitive with fully supervised methods and significantly outperforms current unsupervised methods. TTL methods with a smaller model are also competitive with the current state-of-the-art in unsupervised reading comprehension.
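A minimal sketch of the test-time learning loop the abstract describes, assuming a Hugging Face extractive QA backbone (transformers + PyTorch). The model name, the regex cloze heuristic in synthesize_qa_pairs, and all hyperparameters are illustrative stand-ins, not the authors' released implementation; in the paper, synthetic pairs come from more sophisticated procedural generation.

# Sketch of test-time learning (TTL): given ONE test context, synthesize
# cloze-style QA pairs from it, fine-tune an extractive QA model on those
# pairs, then answer the human-authored questions for that context.
import re
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

MODEL = "distilbert-base-uncased"   # assumption: any extractive QA backbone
tokenizer = AutoTokenizer.from_pretrained(MODEL)
model = AutoModelForQuestionAnswering.from_pretrained(MODEL)

def synthesize_qa_pairs(context, max_pairs=20):
    """Toy cloze generator: treat capitalized spans as answers and turn
    their sentence into a query by masking the span with 'what'."""
    pairs = []
    for sent in re.split(r"(?<=[.!?])\s+", context):
        for m in re.finditer(r"[A-Z][a-z]+(?: [A-Z][a-z]+)*", sent):
            answer = m.group(0)
            question = sent.replace(answer, "what", 1)
            start = context.find(answer)  # char offset of the answer span
            if start != -1:
                pairs.append((question, answer, start))
            if len(pairs) >= max_pairs:
                return pairs
    return pairs

def ttl_answer(context, test_questions, steps=2, lr=3e-5):
    """Fine-tune on synthetic pairs from this single context, then infer."""
    optim = torch.optim.AdamW(model.parameters(), lr=lr)
    pairs = synthesize_qa_pairs(context)
    model.train()
    for _ in range(steps):
        for question, answer, start_char in pairs:
            enc = tokenizer(question, context, return_tensors="pt",
                            truncation=True, max_length=384)
            # Map character offsets of the answer span to token positions
            # (sequence_index=1 selects the context segment).
            s = enc.char_to_token(0, start_char, sequence_index=1)
            e = enc.char_to_token(0, start_char + len(answer) - 1,
                                  sequence_index=1)
            if s is None or e is None:
                continue  # answer truncated away; skip this pair
            loss = model(**enc, start_positions=torch.tensor([s]),
                         end_positions=torch.tensor([e])).loss
            loss.backward()
            optim.step()
            optim.zero_grad()
    model.eval()
    answers = []
    for q in test_questions:
        enc = tokenizer(q, context, return_tensors="pt",
                        truncation=True, max_length=384)
        with torch.no_grad():
            out = model(**enc)
        s = int(out.start_logits.argmax())
        e = int(out.end_logits.argmax())
        answers.append(tokenizer.decode(enc["input_ids"][0][s : e + 1]))
    return answers

Note that the QA head of a plain pretrained backbone starts randomly initialized; the point of the TTL step is precisely that training on the context's own synthetic pairs adapts the model before any test question is answered.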
Anthology ID:
2021.naacl-main.95
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month:
June
Year:
2021
Address:
Online
Editors:
Kristina Toutanova, Anna Rumshisky, Luke Zettlemoyer, Dilek Hakkani-Tur, Iz Beltagy, Steven Bethard, Ryan Cotterell, Tanmoy Chakraborty, Yichao Zhou
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1200–1211
URL:
https://aclanthology.org/2021.naacl-main.95
DOI:
10.18653/v1/2021.naacl-main.95
Cite (ACL):
Pratyay Banerjee, Tejas Gokhale, and Chitta Baral. 2021. Self-Supervised Test-Time Learning for Reading Comprehension. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 1200–1211, Online. Association for Computational Linguistics.
Cite (Informal):
Self-Supervised Test-Time Learning for Reading Comprehension (Banerjee et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-main.95.pdf
Video:
https://aclanthology.org/2021.naacl-main.95.mp4
Data:
Natural Questions, NewsQA, QA-SRL, SQuAD