Syntax-guided Localized Self-attention by Constituency Syntactic Distance

Shengyuan Hou, Jushi Kai, Haotian Xue, Bingyu Zhu, Bo Yuan, Longtao Huang, Xinbing Wang, Zhouhan Lin


Abstract
Recent works have revealed that Transformers implicitly learn syntactic information in their lower layers from data, although this learning is highly dependent on the quality and scale of the training data. However, learning syntactic information from data is unnecessary if we can leverage an external syntactic parser, which provides better parsing quality with well-defined syntactic structures. This could potentially improve the Transformer’s performance and sample efficiency. In this work, we propose a syntax-guided localized self-attention for the Transformer that directly incorporates grammar structures from an external constituency parser. It prevents the attention mechanism from overweighting grammatically distant tokens relative to close ones. Experimental results show that our model consistently improves translation performance on a variety of machine translation datasets, ranging from small to large, and across different source languages.
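
The abstract does not spell out how syntactic distances become a locality bias, but the idea can be illustrated with a minimal sketch. Assuming per-gap constituency syntactic distances from an external parser, the pairwise distance between two tokens can be taken as the maximum gap distance in the span between them, then subtracted (scaled by a hypothetical coefficient lam) from the attention logits so grammatically distant tokens are down-weighted. The function names, the max-pooling of gap distances, and the additive penalty form are illustrative assumptions, not the paper's exact formulation:

import numpy as np

def pairwise_syntactic_distance(gap_dist):
    # gap_dist[k] is the constituency syntactic distance between adjacent
    # tokens k and k+1 (length n-1 for a sentence of n tokens), e.g. as
    # produced by an external constituency parser. The pairwise distance
    # D[i, j] is taken here as the maximum gap distance inside [i, j).
    n = len(gap_dist) + 1
    D = np.zeros((n, n))
    for i in range(n):
        running_max = 0.0
        for j in range(i + 1, n):
            running_max = max(running_max, gap_dist[j - 1])
            D[i, j] = D[j, i] = running_max
    return D

def localized_self_attention(Q, K, V, D, lam=1.0):
    # Scaled dot-product attention whose logits are penalized by lam * D,
    # so grammatically distant token pairs receive lower attention weight.
    # lam is a hypothetical hyperparameter controlling locality strength.
    d_k = Q.shape[-1]
    logits = Q @ K.T / np.sqrt(d_k) - lam * D
    weights = np.exp(logits - logits.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V

# Usage on random inputs; gap_dist stands in for parser output.
rng = np.random.default_rng(0)
n, d_model = 5, 8
Q = rng.standard_normal((n, d_model))
K = rng.standard_normal((n, d_model))
V = rng.standard_normal((n, d_model))
gap_dist = np.array([1.0, 3.0, 1.0, 2.0])
D = pairwise_syntactic_distance(gap_dist)
out = localized_self_attention(Q, K, V, D)
print(out.shape)  # (5, 8)

An additive logit penalty (rather than a hard mask) keeps the sketch differentiable and lets lam interpolate between ordinary self-attention (lam = 0) and strongly localized attention.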
Anthology ID:
2022.findings-emnlp.173
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2334–2341
URL:
https://aclanthology.org/2022.findings-emnlp.173
DOI:
10.18653/v1/2022.findings-emnlp.173
Cite (ACL):
Shengyuan Hou, Jushi Kai, Haotian Xue, Bingyu Zhu, Bo Yuan, Longtao Huang, Xinbing Wang, and Zhouhan Lin. 2022. Syntax-guided Localized Self-attention by Constituency Syntactic Distance. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 2334–2341, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Syntax-guided Localized Self-attention by Constituency Syntactic Distance (Hou et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.173.pdf