Using Interpretation Methods for Model Enhancement

Zhuo Chen, Chengyue Jiang, Kewei Tu


Abstract
In the age of neural natural language processing, many works attempt to derive interpretations of neural models. Intuitively, when gold rationales are available during training, one can additionally train the model to match its interpretation with the rationales. However, this intuitive idea has not been fully explored. In this paper, we propose a framework that utilizes interpretation methods and gold rationales to enhance models. Our framework is general in the sense that it can incorporate various interpretation methods. Previously proposed gradient-based methods can be shown to be instances of our framework. We also propose two novel instances utilizing two other types of interpretation methods, erasure/replace-based and extractor-based methods, for model enhancement. We conduct comprehensive experiments on a variety of tasks. Experimental results show that our framework is effective, especially in low-resource settings, in enhancing models with various interpretation methods, and that our two newly proposed methods outperform gradient-based methods in most settings. Code is available at https://github.com/Chord-Chen-30/UIMER.
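To make the core idea concrete, below is a minimal sketch of the gradient-based instance the abstract mentions: alongside the task loss, a second term penalizes mismatch between a saliency-based interpretation and gold token-level rationales. This is not the authors' released implementation (see the linked repository for that); the HuggingFace-style model interface, the `lambda_weight` coefficient, and the choice of MSE alignment loss are illustrative assumptions.

```python
import torch
import torch.nn.functional as F

def joint_loss(model, embeddings, labels, rationale_mask, lambda_weight=0.5):
    """Sketch of a rationale-alignment training objective.

    embeddings:     (batch, seq_len, dim) input token embeddings
    labels:         (batch,) gold task labels
    rationale_mask: (batch, seq_len) binary gold rationale annotations
    """
    # Make embeddings a leaf tensor so we can take gradients w.r.t. them.
    embeddings = embeddings.detach().clone().requires_grad_(True)

    # Assumes a HuggingFace-style classifier accepting inputs_embeds.
    logits = model(inputs_embeds=embeddings).logits
    task_loss = F.cross_entropy(logits, labels)

    # Gradient-based interpretation: per-token saliency w.r.t. the gold
    # class score (one instance of the general framework).
    score = logits.gather(1, labels.unsqueeze(1)).sum()
    grads, = torch.autograd.grad(score, embeddings, create_graph=True)
    saliency = grads.norm(dim=-1)  # (batch, seq_len)
    saliency = saliency / (saliency.sum(-1, keepdim=True) + 1e-8)

    # Train the interpretation to match the gold rationale distribution.
    target = rationale_mask.float()
    target = target / (target.sum(-1, keepdim=True) + 1e-8)
    interp_loss = F.mse_loss(saliency, target)

    return task_loss + lambda_weight * interp_loss
```

Passing `create_graph=True` is what lets the alignment term itself be backpropagated, so the model is trained both to predict correctly and to attend to the annotated rationale tokens.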
Anthology ID: 2023.emnlp-main.28
Volume: Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 424–438
URL: https://aclanthology.org/2023.emnlp-main.28
DOI: 10.18653/v1/2023.emnlp-main.28
Cite (ACL): Zhuo Chen, Chengyue Jiang, and Kewei Tu. 2023. Using Interpretation Methods for Model Enhancement. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 424–438, Singapore. Association for Computational Linguistics.
Cite (Informal): Using Interpretation Methods for Model Enhancement (Chen et al., EMNLP 2023)
PDF: https://aclanthology.org/2023.emnlp-main.28.pdf
Video: https://aclanthology.org/2023.emnlp-main.28.mp4