ExSum: From Local Explanations to Model Understanding

Yilun Zhou, Marco Tulio Ribeiro, Julie Shah


Abstract
Interpretability methods are developed to understand the working mechanisms of black-box models, which is crucial to their responsible deployment. Fulfilling this goal requires both that the explanations generated by these methods are correct and that people can easily and reliably understand them. While the former has been addressed in prior work, the latter is often overlooked, resulting in informal model understanding derived from a handful of local explanations. In this paper, we introduce explanation summary (ExSum), a mathematical framework for quantifying model understanding, and propose metrics for its quality assessment. On two domains, ExSum highlights various limitations in the current practice, helps develop accurate model understanding, and reveals easily overlooked properties of the model. We also connect understandability to other properties of explanations such as human alignment, robustness, and counterfactual similarity and plausibility.
Anthology ID: 2022.naacl-main.392
Volume: Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Month: July
Year: 2022
Address: Seattle, United States
Editors: Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 5359–5378
URL: https://aclanthology.org/2022.naacl-main.392
DOI: 10.18653/v1/2022.naacl-main.392
Cite (ACL): Yilun Zhou, Marco Tulio Ribeiro, and Julie Shah. 2022. ExSum: From Local Explanations to Model Understanding. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, pages 5359–5378, Seattle, United States. Association for Computational Linguistics.
Cite (Informal): ExSum: From Local Explanations to Model Understanding (Zhou et al., NAACL 2022)
PDF: https://aclanthology.org/2022.naacl-main.392.pdf
Video: https://aclanthology.org/2022.naacl-main.392.mp4
Code: YilunZhou/ExSum
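
As a rough illustration of the idea described in the abstract (moving from a handful of local explanations to a quantified, generalized model understanding), the sketch below checks how often a simple candidate rule about feature attributions holds over a batch of explanations. The rule, the toy data, and the simplified metric definitions are illustrative assumptions only; this is not the paper's formal framework and not the API of the YilunZhou/ExSum repository.

```python
# Illustrative sketch only: NOT the ExSum package API, just a toy example of
# quantifying how well a generalized rule describes a set of local
# (token, saliency) explanations.

from typing import List, Tuple

# A "local explanation" here is a list of (token, saliency) pairs for one input.
Explanation = List[Tuple[str, float]]

NEGATION_WORDS = {"not", "never", "no"}  # hypothetical word list


def rule_applies(token: str) -> bool:
    """Applicability: the candidate rule only talks about negation words."""
    return token.lower() in NEGATION_WORDS


def rule_holds(saliency: float) -> bool:
    """Behavior the rule claims: negation words receive negative saliency."""
    return saliency < 0


def evaluate_rule(explanations: List[Explanation]) -> Tuple[float, float]:
    """Return (coverage, validity) of the rule over all token-level explanations.

    coverage: fraction of tokens the rule applies to.
    validity: among applicable tokens, fraction where the claimed behavior holds.
    (Metric names are borrowed loosely; the definitions here are simplified.)
    """
    total, applicable, correct = 0, 0, 0
    for expl in explanations:
        for token, saliency in expl:
            total += 1
            if rule_applies(token):
                applicable += 1
                if rule_holds(saliency):
                    correct += 1
    coverage = applicable / total if total else 0.0
    validity = correct / applicable if applicable else 0.0
    return coverage, validity


if __name__ == "__main__":
    # Two toy explanations for a sentiment model (made-up numbers).
    toy_explanations = [
        [("the", 0.01), ("movie", 0.05), ("was", 0.0), ("not", -0.30), ("good", 0.40)],
        [("never", -0.25), ("boring", -0.10), ("at", 0.0), ("all", 0.02)],
    ]
    cov, val = evaluate_rule(toy_explanations)
    print(f"coverage = {cov:.2f}, validity = {val:.2f}")
```

In the actual framework, such rules and their quality metrics are defined formally and evaluated over full corpora of explanations; see the paper and the linked repository for the real definitions and tooling.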