%0 Conference Proceedings
%T Elastic weight consolidation for better bias inoculation
%A Thorne, James
%A Vlachos, Andreas
%Y Merlo, Paola
%Y Tiedemann, Jörg
%Y Tsarfaty, Reut
%S Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
%D 2021
%8 April
%I Association for Computational Linguistics
%C Online
%F thorne-vlachos-2021-elastic
%X The biases present in training datasets have been shown to affect models for sentence pair classification tasks such as natural language inference (NLI) and fact verification. While fine-tuning models on additional data has been used to mitigate them, a common issue is that of catastrophic forgetting of the original training dataset. In this paper, we show that elastic weight consolidation (EWC) allows fine-tuning of models to mitigate biases while being less susceptible to catastrophic forgetting. In our evaluation on fact verification and NLI stress tests, we show that fine-tuning with EWC dominates standard fine-tuning, yielding models with lower levels of forgetting on the original (biased) dataset for equivalent gains in accuracy on the fine-tuning (unbiased) dataset.
%R 10.18653/v1/2021.eacl-main.82
%U https://aclanthology.org/2021.eacl-main.82
%U https://doi.org/10.18653/v1/2021.eacl-main.82
%P 957-964