The Functional Relevance of Probed Information: A Case Study

Michael Hanna, Roberto Zamparelli, David Mareček


Abstract
Recent studies have shown that transformer models like BERT rely on number information encoded in their representations of sentences’ subjects and head verbs when performing subject-verb agreement. However, probing experiments suggest that subject number is also encoded in the representations of all words in such sentences. In this paper, we use causal interventions to show that BERT only uses the subject plurality information encoded in its representations of the subject and words that agree with it in number. We also demonstrate that current probing metrics are unable to determine which words’ representations contain functionally relevant information. This both provides a revised view of subject-verb agreement in language models, and suggests potential pitfalls for current probe usage and evaluation.
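The abstract contrasts probing (decoding information from representations) with causal interventions (testing whether a model actually uses that information). As a rough illustration of the probing side only, below is a minimal sketch, not the authors' code: it trains a linear classifier to predict subject number from BERT token representations. The model name, layer index, and toy sentences are illustrative assumptions.

```python
# Minimal probing sketch (illustrative, not the paper's implementation):
# train a linear probe for subject number on BERT token representations.
import torch
from transformers import BertTokenizer, BertModel
from sklearn.linear_model import LogisticRegression

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased", output_hidden_states=True)
model.eval()

# Toy sentences labeled by subject number (0 = singular, 1 = plural).
sentences = [
    ("the dog near the trees barks", 0),
    ("the dogs near the tree bark", 1),
]

features, labels = [], []
for text, number in sentences:
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        # Hidden states from an arbitrary middle layer (assumption).
        hidden = model(**inputs).hidden_states[8]
    # Probe every non-special token's representation for the subject's number.
    for vec in hidden[0, 1:-1]:
        features.append(vec.numpy())
        labels.append(number)

probe = LogisticRegression(max_iter=1000).fit(features, labels)
print("train accuracy:", probe.score(features, labels))
```

High probe accuracy on non-subject tokens would show only that number is decodable there; the paper's point is that such decodability does not imply the model functionally relies on it, which is what the causal interventions test.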
Anthology ID: 2023.eacl-main.58
Volume: Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month: May
Year: 2023
Address: Dubrovnik, Croatia
Editors: Andreas Vlachos, Isabelle Augenstein
Venue: EACL
Publisher: Association for Computational Linguistics
Pages: 835–848
URL: https://aclanthology.org/2023.eacl-main.58
DOI: 10.18653/v1/2023.eacl-main.58
Cite (ACL): Michael Hanna, Roberto Zamparelli, and David Mareček. 2023. The Functional Relevance of Probed Information: A Case Study. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 835–848, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal): The Functional Relevance of Probed Information: A Case Study (Hanna et al., EACL 2023)
PDF: https://aclanthology.org/2023.eacl-main.58.pdf
Video: https://aclanthology.org/2023.eacl-main.58.mp4