Systematicity Emerges in Transformers when Abstract Grammatical Roles Guide Attention

Ayush K Chakravarthy, Jacob Labe Russin, Randall O’Reilly


Abstract
Systematicity is thought to be a key inductive bias possessed by humans that is lacking in standard natural language processing systems such as those utilizing transformers. In this work, we investigate the extent to which the failure of transformers on systematic generalization tests can be attributed to a lack of linguistic abstraction in their attention mechanisms. We develop a novel modification to the transformer by implementing two separate input streams: a role stream controls the attention distributions (i.e., queries and keys) at each layer, while a filler stream determines the values. Our results show that when abstract role labels are assigned to input sequences and provided to the role stream, systematic generalization is improved.
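The two-stream mechanism the abstract describes can be illustrated with a minimal PyTorch sketch, in which queries and keys are computed only from role embeddings (so abstract grammatical roles alone decide where attention goes) while values are computed from filler embeddings. All names, shapes, and projection choices below are illustrative assumptions, not the paper's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoStreamAttention(nn.Module):
    """Hypothetical sketch of role/filler attention, per the abstract:
    the role stream drives queries and keys; the filler stream drives values."""
    def __init__(self, d_model):
        super().__init__()
        self.q_proj = nn.Linear(d_model, d_model)  # role stream -> queries
        self.k_proj = nn.Linear(d_model, d_model)  # role stream -> keys
        self.v_proj = nn.Linear(d_model, d_model)  # filler stream -> values
        self.scale = d_model ** -0.5

    def forward(self, roles, fillers):
        # roles, fillers: (batch, seq_len, d_model)
        q = self.q_proj(roles)
        k = self.k_proj(roles)
        v = self.v_proj(fillers)
        # Attention weights depend only on the role stream.
        attn = F.softmax(q @ k.transpose(-2, -1) * self.scale, dim=-1)
        return attn @ v

# Usage: abstract role embeddings steer attention over filler content.
layer = TwoStreamAttention(d_model=64)
roles = torch.randn(2, 10, 64)    # embeddings of abstract grammatical role labels
fillers = torch.randn(2, 10, 64)  # embeddings of the actual input tokens
out = layer(roles, fillers)       # -> (2, 10, 64)

In this sketch the attention pattern is a function of the role embeddings alone, which is the separation the abstract credits for the improvement in systematic generalization.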
Anthology ID:
2022.naacl-srw.1
Volume:
Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop
Month:
July
Year:
2022
Address:
Hybrid: Seattle, Washington + Online
Editors:
Daphne Ippolito, Liunian Harold Li, Maria Leonor Pacheco, Danqi Chen, Nianwen Xue
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
1–8
URL:
https://aclanthology.org/2022.naacl-srw.1
DOI:
10.18653/v1/2022.naacl-srw.1
Cite (ACL):
Ayush K Chakravarthy, Jacob Labe Russin, and Randall O’Reilly. 2022. Systematicity Emerges in Transformers when Abstract Grammatical Roles Guide Attention. In Proceedings of the 2022 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Student Research Workshop, pages 1–8, Hybrid: Seattle, Washington + Online. Association for Computational Linguistics.
Cite (Informal):
Systematicity Emerges in Transformers when Abstract Grammatical Roles Guide Attention (Chakravarthy et al., NAACL 2022)
PDF:
https://aclanthology.org/2022.naacl-srw.1.pdf
Video:
https://aclanthology.org/2022.naacl-srw.1.mp4
Data:
SCAN