Approximate Attributions for Off-the-Shelf Siamese Transformers

Lucas Moeller, Dmitry Nikolaev, Sebastian Padó


Abstract
Siamese encoders such as sentence transformers are among the least understood deep models.Established attribution methods cannot tackle this model class since it compares two inputs rather than processing a single one. To address this gap, we have recently proposed an attribution method specifically for Siamese encoders (Möller et al., 2023). However, it requires models to be adjusted and fine-tuned and therefore cannot be directly applied to off-the-shelf models. In this work, we reassess these restrictions and propose (i) a model with exact attribution ability that retains the original model’s predictive performance and (ii) a way to compute approximate attributions for off-the-shelf models.We extensively compare approximate and exact attributions and use them to analyze the models’ attendance to different linguistic aspects. We gain insights into which syntactic roles Siamese transformers attend to, confirm that they mostly ignore negation, explore how they judge semantically opposite adjectives, and find that they exhibit lexical bias.
Anthology ID: 2024.eacl-long.125
Volume: Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers)
Month: March
Year: 2024
Address: St. Julian’s, Malta
Editors: Yvette Graham, Matthew Purver
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 2059–2071
URL: https://aclanthology.org/2024.eacl-long.125
Cite (ACL):
Lucas Moeller, Dmitry Nikolaev, and Sebastian Padó. 2024. Approximate Attributions for Off-the-Shelf Siamese Transformers. In Proceedings of the 18th Conference of the European Chapter of the Association for Computational Linguistics (Volume 1: Long Papers), pages 2059–2071, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal):
Approximate Attributions for Off-the-Shelf Siamese Transformers (Moeller et al., EACL 2024)
PDF: https://aclanthology.org/2024.eacl-long.125.pdf
Software: 2024.eacl-long.125.software.zip