The Architectural Bottleneck Principle

Tiago Pimentel, Josef Valvoda, Niklas Stoehr, Ryan Cotterell


Abstract
In this paper, we seek to measure how much information a component in a neural network could extract from the representations fed into it. Our work stands in contrast to prior probing work, most of which investigates how much information a model's representations contain. This shift in perspective leads us to propose a new principle for probing, the architectural bottleneck principle: In order to estimate how much information a given component could extract, a probe should look exactly like the component. Relying on this principle, we estimate how much syntactic information is available to transformers through our attentional probe, a probe that exactly resembles a transformer's self-attention head. Experimentally, we find that, in three models (BERT, ALBERT, and RoBERTa), a sentence's syntax tree is mostly extractable by our probe, suggesting these models have access to syntactic information while composing their contextual representations. Whether this information is actually used by these models, however, remains an open question.
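To make the abstract's notion of an attentional probe concrete, below is a minimal sketch in PyTorch. It assumes the probe mirrors a single query-key self-attention head applied to frozen contextual representations, and that each word's syntactic head is read off the resulting attention distribution; the class name, dimensions, and head-prediction framing are illustrative assumptions, not the paper's exact implementation.

```python
import torch
import torch.nn as nn

class AttentionalProbe(nn.Module):
    """Probe shaped like one self-attention head: it scores every candidate
    head word for each dependent via a scaled query-key dot product.
    (Hypothetical sketch; dimensions and naming are illustrative.)"""

    def __init__(self, hidden_dim: int = 768, head_dim: int = 64):
        super().__init__()
        self.query = nn.Linear(hidden_dim, head_dim)  # W_Q of the probed head
        self.key = nn.Linear(hidden_dim, head_dim)    # W_K of the probed head

    def forward(self, reprs: torch.Tensor) -> torch.Tensor:
        # reprs: (batch, seq_len, hidden_dim) frozen contextual representations
        q = self.query(reprs)                # (batch, seq_len, head_dim)
        k = self.key(reprs)                  # (batch, seq_len, head_dim)
        scores = q @ k.transpose(-2, -1)     # (batch, seq_len, seq_len)
        scores = scores / (k.size(-1) ** 0.5)
        # Row i is a distribution over candidate syntactic heads for word i.
        return scores.softmax(dim=-1)


# Usage sketch: predict each word's head as the highest-scoring position.
probe = AttentionalProbe()
hidden = torch.randn(1, 10, 768)            # stand-in for frozen BERT outputs
head_dist = probe(hidden)
predicted_heads = head_dist.argmax(dim=-1)  # (1, 10)
```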
Anthology ID:
2022.emnlp-main.788
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
11459–11472
URL:
https://aclanthology.org/2022.emnlp-main.788
DOI:
10.18653/v1/2022.emnlp-main.788
Cite (ACL):
Tiago Pimentel, Josef Valvoda, Niklas Stoehr, and Ryan Cotterell. 2022. The Architectural Bottleneck Principle. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 11459–11472, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
The Architectural Bottleneck Principle (Pimentel et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.788.pdf