Proto-lm: A Prototypical Network-Based Framework for Built-in Interpretability in Large Language Models

Sean Xie, Soroush Vosoughi, Saeed Hassanpour


Abstract
Large Language Models (LLMs) have significantly advanced the field of Natural Language Processing (NLP), but their lack of interpretability remains a major concern. Current methods for interpreting LLMs are post hoc, applied after inference, and have limitations such as their focus on low-level features and their lack of explainability at higher-level text units. In this work, we introduce proto-lm, a prototypical network-based white-box framework that allows LLMs to learn immediately interpretable embeddings during the fine-tuning stage while maintaining competitive performance. We demonstrate the applicability and interpretability of our method through experiments on a wide range of NLP tasks, and our results indicate that interpretable models can be built without sacrificing performance. We release our code at https://github.com/yx131/proto-lm.
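
For readers unfamiliar with prototypical networks, the sketch below illustrates the general idea of a prototype-based classification head placed on top of a transformer encoder's sentence embedding. This is a minimal, hypothetical example, not the authors' exact architecture: the class and parameter names, the number of prototypes per class, and the use of cosine similarity are illustrative assumptions, and the actual proto-lm implementation (including its loss terms) is available at the repository linked above.

```python
# Hypothetical sketch of a prototype-based classification head on top of an
# LLM encoder. Illustrates the prototypical-network idea only; the exact
# proto-lm architecture and training objective are defined in the paper and
# in the released code at https://github.com/yx131/proto-lm.
import torch
import torch.nn as nn
import torch.nn.functional as F


class PrototypeHead(nn.Module):
    """Classifies an input embedding by its similarity to learned prototypes.

    Each class owns `protos_per_class` prototype vectors; the similarity
    scores to those prototypes form an interpretable intermediate
    representation, and a linear layer maps them to class logits.
    """

    def __init__(self, hidden_dim: int, num_classes: int, protos_per_class: int = 4):
        super().__init__()
        num_protos = num_classes * protos_per_class
        # Learnable prototype vectors living in the encoder's embedding space.
        self.prototypes = nn.Parameter(torch.randn(num_protos, hidden_dim))
        # Linear layer mapping prototype similarities to class logits.
        self.classifier = nn.Linear(num_protos, num_classes, bias=False)

    def forward(self, embedding: torch.Tensor):
        # embedding: (batch, hidden_dim), e.g. the encoder's [CLS] vector.
        # Cosine similarity between each example and every prototype.
        sims = F.cosine_similarity(
            embedding.unsqueeze(1),        # (batch, 1, hidden_dim)
            self.prototypes.unsqueeze(0),  # (1, num_protos, hidden_dim)
            dim=-1,
        )                                  # -> (batch, num_protos)
        logits = self.classifier(sims)
        # Returning the similarities alongside the logits lets us inspect
        # which prototypes most influenced a prediction.
        return logits, sims


# Minimal usage example with random tensors standing in for encoder outputs.
if __name__ == "__main__":
    head = PrototypeHead(hidden_dim=768, num_classes=2)
    cls_embeddings = torch.randn(8, 768)   # stand-in for LLM [CLS] embeddings
    logits, sims = head(cls_embeddings)
    print(logits.shape, sims.shape)        # torch.Size([8, 2]) torch.Size([8, 8])
```

In a setup like this, the per-prototype similarity scores serve as the interpretable intermediate representation: a prediction can be explained by pointing to the prototypes, and the training examples associated with them, that contributed most to the decision.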
Anthology ID:
2023.findings-emnlp.261
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3964–3979
URL:
https://aclanthology.org/2023.findings-emnlp.261
DOI:
10.18653/v1/2023.findings-emnlp.261
Cite (ACL):
Sean Xie, Soroush Vosoughi, and Saeed Hassanpour. 2023. Proto-lm: A Prototypical Network-Based Framework for Built-in Interpretability in Large Language Models. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 3964–3979, Singapore. Association for Computational Linguistics.
Cite (Informal):
Proto-lm: A Prototypical Network-Based Framework for Built-in Interpretability in Large Language Models (Xie et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.261.pdf