IAEval: A Comprehensive Evaluation of Instance Attribution on Natural Language Understanding

Peijian Gu, Yaozong Shen, Lijie Wang, Quan Wang, Hua Wu, Zhendong Mao


Abstract
Instance attribution (IA) aims to identify the training instances responsible for a model's prediction on a test example, helping researchers better understand datasets and optimize data processing. While many IA methods have been proposed recently, how to evaluate them remains an open question: previous evaluations of IA focus on only one or two dimensions and are not comprehensive. In this work, we introduce IAEval, a systematic and comprehensive evaluation scheme for IA methods covering four significant requirements: sufficiency, completeness, stability, and plausibility. We carefully design novel metrics to measure these requirements for the first time. Three representative IA methods are evaluated under IAEval on four natural language understanding datasets. Extensive experiments confirm the effectiveness of IAEval and demonstrate its ability to provide comprehensive comparisons among IA methods. With IAEval, researchers can choose the most suitable IA method for applications such as model debugging.
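To make the task concrete, below is a minimal sketch of one common family of IA methods: gradient-similarity attribution (in the spirit of TracIn), where a training instance scores higher when its loss gradient aligns with the test example's loss gradient. This is a generic illustration, not the paper's IAEval metrics; the function names (`loss_grad`, `attribution_scores`) and the toy model are hypothetical.

```python
# Hypothetical sketch of gradient-similarity instance attribution:
# score each training instance by the dot product between its loss
# gradient and the test example's loss gradient. NOT the IAEval scheme
# itself, just an example of an IA method it could evaluate.
import torch
import torch.nn.functional as F

def loss_grad(model, x, y):
    """Flattened loss gradient for a single (input, label) pair."""
    loss = F.cross_entropy(model(x.unsqueeze(0)), y.unsqueeze(0))
    grads = torch.autograd.grad(loss, list(model.parameters()))
    return torch.cat([g.reshape(-1) for g in grads])

def attribution_scores(model, train_set, x_test, y_test):
    """Higher score = training instance more influential on the test prediction."""
    g_test = loss_grad(model, x_test, y_test)
    return [torch.dot(loss_grad(model, x, y), g_test).item()
            for x, y in train_set]

# Toy usage with a linear classifier over random features.
torch.manual_seed(0)
model = torch.nn.Linear(8, 2)
train_set = [(torch.randn(8), torch.tensor(i % 2)) for i in range(16)]
x_test, y_test = torch.randn(8), torch.tensor(1)
scores = attribution_scores(model, train_set, x_test, y_test)
top = sorted(range(len(scores)), key=lambda i: -scores[i])[:3]
print("Most influential training indices:", top)
```

An evaluation scheme like IAEval would then probe such ranked attributions along dimensions such as sufficiency and stability, e.g., by checking how the model behaves when retrained on (or without) the top-ranked instances.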
Anthology ID:
2023.findings-emnlp.801
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
11966–11977
URL:
https://aclanthology.org/2023.findings-emnlp.801
DOI:
10.18653/v1/2023.findings-emnlp.801
Cite (ACL):
Peijian Gu, Yaozong Shen, Lijie Wang, Quan Wang, Hua Wu, and Zhendong Mao. 2023. IAEval: A Comprehensive Evaluation of Instance Attribution on Natural Language Understanding. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 11966–11977, Singapore. Association for Computational Linguistics.
Cite (Informal):
IAEval: A Comprehensive Evaluation of Instance Attribution on Natural Language Understanding (Gu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.801.pdf