When LRP Diverges from Leave-One-Out in Transformers

Weiqiu You, Siqi Zeng, Yao-Hung Hubert Tsai, Makoto Yamada, Han Zhao


Abstract
Leave-One-Out (LOO) provides an intuitive measure of feature importance but is computationally prohibitive. While Layer-Wise Relevance Propagation (LRP) offers a potentially efficient alternative, its axiomatic soundness in modern Transformers remains under-examined. In this work, we first show that the bilinear propagation rules used in the recently proposed AttnLRP method violate implementation invariance. We prove this analytically and confirm it empirically in linear attention layers. Second, we revisit CP-LRP as a diagnostic baseline and find that bypassing relevance propagation through the softmax layer (back-propagating relevance only through the value matrices) significantly improves alignment with LOO, particularly in the middle-to-late Transformer layers. Overall, our results suggest that (i) bilinear factorization sensitivity and (ii) softmax propagation error potentially jointly undermine LRP's ability to approximate LOO in Transformers.
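
To make the two notions in the abstract concrete, below is a minimal sketch (not the authors' code) of LOO token importance for a HuggingFace-style sequence classifier; model, input_ids, and mask_token_id are hypothetical stand-ins. It also illustrates why LOO is computationally prohibitive: scoring a sequence requires one extra forward pass per token.

import torch

def loo_importance(model, input_ids, mask_token_id):
    """Per-token LOO score: drop in the predicted-class logit when token i is masked."""
    model.eval()
    with torch.no_grad():
        base_logits = model(input_ids).logits        # (1, num_classes), HF-style output
        target = base_logits.argmax(dim=-1)          # originally predicted class
        base_score = base_logits[0, target]
        scores = []
        for i in range(input_ids.shape[1]):          # one extra forward pass per token
            perturbed = input_ids.clone()
            perturbed[0, i] = mask_token_id          # "leave out" token i
            scores.append((base_score - model(perturbed).logits[0, target]).item())
    return scores

The CP-LRP-style softmax bypass mentioned in the abstract can be sketched, under the assumption of a standard PyTorch attention implementation, by treating the attention weights as constants so that gradients (and hence gradient-based relevance) flow only through the value path; this is an illustrative sketch of the idea, not the paper's implementation.

import torch.nn.functional as F

def value_path_attention(q, k, v):
    # Softmax branch is detached (treated as a constant), so relevance/gradients
    # propagate only through the value matrix v.
    attn = F.softmax(q @ k.transpose(-2, -1) / (q.size(-1) ** 0.5), dim=-1)
    return attn.detach() @ v
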
Anthology ID:
2025.blackboxnlp-1.10
Volume:
Proceedings of the 8th BlackboxNLP Workshop: Analyzing and Interpreting Neural Networks for NLP
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Yonatan Belinkov, Aaron Mueller, Najoung Kim, Hosein Mohebbi, Hanjie Chen, Dana Arad, Gabriele Sarti
Venues:
BlackboxNLP | WS
Publisher:
Association for Computational Linguistics
Pages:
176–188
URL:
https://aclanthology.org/2025.blackboxnlp-1.10/
Cite (ACL):
Weiqiu You, Siqi Zeng, Yao-Hung Hubert Tsai, Makoto Yamada, and Han Zhao. 2025. When LRP Diverges from Leave-One-Out in Transformers. In Proceedings of the 8th BlackboxNLP Workshop: Analyzing and Interpreting Neural Networks for NLP, pages 176–188, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
When LRP Diverges from Leave-One-Out in Transformers (You et al., BlackboxNLP 2025)
PDF:
https://aclanthology.org/2025.blackboxnlp-1.10.pdf