The elephant in the interpretability room: Why use attention as explanation when we have saliency methods?

Jasmijn Bastings, Katja Filippova


Abstract
There is a recent surge of interest in using attention as explanation of model predictions, with mixed evidence on whether attention can be used as such. While attention conveniently gives us one weight per input token and is easily extracted, it is often unclear toward what goal it is used as explanation. We find that often that goal, whether explicitly stated or not, is to find out what input tokens are the most relevant to a prediction, and that the implied user for the explanation is a model developer. For this goal and user, we argue that input saliency methods are better suited, and that there are no compelling reasons to use attention, despite the coincidence that it provides a weight for each input. With this position paper, we hope to shift some of the recent focus on attention to saliency methods, and for authors to clearly state the goal and user for their explanations.
Anthology ID:
2020.blackboxnlp-1.14
Volume:
Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP
Month:
November
Year:
2020
Address:
Online
Editors:
Afra Alishahi, Yonatan Belinkov, Grzegorz Chrupała, Dieuwke Hupkes, Yuval Pinter, Hassan Sajjad
Venue:
BlackboxNLP
Publisher:
Association for Computational Linguistics
Pages:
149–155
URL:
https://aclanthology.org/2020.blackboxnlp-1.14
DOI:
10.18653/v1/2020.blackboxnlp-1.14
Bibkey:
Cite (ACL):
Jasmijn Bastings and Katja Filippova. 2020. The elephant in the interpretability room: Why use attention as explanation when we have saliency methods?. In Proceedings of the Third BlackboxNLP Workshop on Analyzing and Interpreting Neural Networks for NLP, pages 149–155, Online. Association for Computational Linguistics.
Cite (Informal):
The elephant in the interpretability room: Why use attention as explanation when we have saliency methods? (Bastings & Filippova, BlackboxNLP 2020)
PDF:
https://aclanthology.org/2020.blackboxnlp-1.14.pdf
Video:
https://slideslive.com/38939764