Increasing Robustness to Spurious Correlations using Forgettable Examples

Yadollah Yaghoobzadeh, Soroush Mehri, Remi Tachet des Combes, T. J. Hazen, Alessandro Sordoni


Abstract
Neural NLP models tend to rely on spurious correlations between labels and input features to perform their tasks. Minority examples, i.e., examples that contradict the spurious correlations present in the majority of data points, have been shown to increase the out-of-distribution generalization of pre-trained language models. In this paper, we first propose using example forgetting to find minority examples without prior knowledge of the spurious correlations present in the dataset. Forgettable examples are instances that are either learned and then forgotten during training, or never learned at all. We show empirically how these examples are related to minorities in our training sets. Then, we introduce a new approach to robustify models by fine-tuning them twice: first on the full training data and then on the minority examples only. We obtain substantial improvements in out-of-distribution generalization when applying our approach to the MNLI, QQP and FEVER datasets.
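The notion of a "forgettable" example from the abstract can be sketched as a simple check over an example's per-epoch correctness history: it is forgettable if it was never classified correctly, or if it was correct at some epoch and incorrect at a later one (a forgetting event). The sketch below is illustrative only; the function name and the toy histories are assumptions, not taken from the paper.

```python
# Minimal sketch of identifying "forgettable" examples from per-epoch
# predictions, following the definition in the abstract: an example is
# forgettable if it is learned and then forgotten, or never learned.

def forgettable(acc_history):
    """acc_history: list of booleans, one per epoch; True means the
    example was classified correctly at that epoch's evaluation."""
    never_learned = not any(acc_history)
    # A forgetting event: correct at epoch t, incorrect at epoch t+1.
    forgotten = any(a and not b for a, b in zip(acc_history, acc_history[1:]))
    return never_learned or forgotten

# Toy correctness histories over 5 epochs (hypothetical data):
print(forgettable([False, True, True, False, True]))    # learned then forgotten -> True
print(forgettable([False, False, False, False, False])) # never learned -> True
print(forgettable([False, False, True, True, True]))    # learned and retained -> False
```

In the paper's setting, examples flagged this way stand in for minority examples and form the training set for the second fine-tuning stage.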
Anthology ID:
2021.eacl-main.291
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3319–3332
URL:
https://aclanthology.org/2021.eacl-main.291
DOI:
10.18653/v1/2021.eacl-main.291
Cite (ACL):
Yadollah Yaghoobzadeh, Soroush Mehri, Remi Tachet des Combes, T. J. Hazen, and Alessandro Sordoni. 2021. Increasing Robustness to Spurious Correlations using Forgettable Examples. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 3319–3332, Online. Association for Computational Linguistics.
Cite (Informal):
Increasing Robustness to Spurious Correlations using Forgettable Examples (Yaghoobzadeh et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.291.pdf
Data
MultiNLI