Robust Question Answering against Distribution Shifts with Test-Time Adaption: An Empirical Study

Hai Ye, Yuyang Ding, Juntao Li, Hwee Tou Ng


Abstract
A deployed question answering (QA) model can easily fail when the test data has a distribution shift compared to the training data. Robustness tuning (RT) methods have been widely studied to enhance model robustness against distribution shifts before model deployment. However, can we improve a model after deployment? To answer this question, we evaluate test-time adaptation (TTA) to improve a model after deployment. We first introduce ColdQA, a unified evaluation benchmark for robust QA against text corruption and changes in language and domain. We then evaluate previous TTA methods on ColdQA and compare them to RT methods. We also propose a novel TTA method called online imitation learning (OIL). Through extensive experiments, we find that TTA is comparable to RT methods, and applying TTA after RT can significantly boost the performance on ColdQA. Our proposed OIL improves TTA to be more robust to variation in hyper-parameters and test distributions over time.
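The abstract does not spell out how a TTA method updates a deployed model, and the paper's OIL algorithm is not reproduced here. As a generic illustration only, below is a minimal entropy-minimization sketch of test-time adaptation (in the style of Tent, a common TTA baseline) for a toy softmax classifier; all function names, the learning rate, and the synthetic batch are hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(z):
    # numerically stable softmax over the class axis
    z = z - z.max(axis=1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=1, keepdims=True)

def mean_entropy(p):
    # average Shannon entropy of the batch's predictive distributions
    return float(-(p * np.log(p + 1e-12)).sum(axis=1).mean())

def tta_entropy_step(W, X, lr=0.1):
    """One test-time adaptation step: descend the mean prediction
    entropy on an unlabeled test batch X, updating only the linear
    classifier weights W (a stand-in for the adapted parameters)."""
    n = X.shape[0]
    P = softmax(X @ W)
    logP = np.log(P + 1e-12)
    # gradient of mean entropy w.r.t. the logits of a softmax classifier
    grad_z = -P * (logP - (P * logP).sum(axis=1, keepdims=True))
    grad_W = X.T @ grad_z / n
    return W - lr * grad_W

rng = np.random.default_rng(0)
X = rng.normal(size=(32, 8))        # hypothetical unlabeled test batch
W = rng.normal(size=(8, 4)) * 0.1   # small weights: near-uniform predictions
before = mean_entropy(softmax(X @ W))
W_adapted = tta_entropy_step(W, X)
after = mean_entropy(softmax(X @ W_adapted))
```

Each adaptation step sharpens the model's predictions on the shifted test distribution without any labels, which is the core idea the paper's evaluated TTA methods share; OIL itself adds an imitation-learning objective on top of this online-update loop.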
Anthology ID:
2022.findings-emnlp.460
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6179–6192
URL:
https://aclanthology.org/2022.findings-emnlp.460
DOI:
10.18653/v1/2022.findings-emnlp.460
Cite (ACL):
Hai Ye, Yuyang Ding, Juntao Li, and Hwee Tou Ng. 2022. Robust Question Answering against Distribution Shifts with Test-Time Adaption: An Empirical Study. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 6179–6192, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Robust Question Answering against Distribution Shifts with Test-Time Adaption: An Empirical Study (Ye et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.460.pdf
Video:
https://aclanthology.org/2022.findings-emnlp.460.mp4