Syntax-Based Attention Masking for Neural Machine Translation

Colin McDonald, David Chiang


Abstract
We present a simple method for extending transformers to source-side trees. We define a number of masks that limit self-attention based on relationships among tree nodes, and we allow each attention head to learn which mask or masks to use. On translation from English to various low-resource languages, and translation in both directions between English and German, our method always improves over simple linearization of the source-side parse tree and almost always improves over a sequence-to-sequence baseline, by up to +2.1 BLEU.
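To make the mechanism concrete, below is a minimal PyTorch-style sketch (not the authors' released code) of self-attention in which each head learns a soft mixture over a set of precomputed binary masks. The module name, the mixture-of-masks parameterization, and the mask tensor format are illustrative assumptions; the paper's actual masks are derived from relationships among source-side parse-tree nodes.

```python
# Sketch only: per-head masked self-attention with a learned mixture over masks.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SyntaxMaskedSelfAttention(nn.Module):
    def __init__(self, d_model, num_heads, num_masks):
        super().__init__()
        assert d_model % num_heads == 0
        self.num_heads = num_heads
        self.d_head = d_model // num_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)
        # One learned logit per (head, mask); softmaxed into a mixture per head.
        self.mask_logits = nn.Parameter(torch.zeros(num_heads, num_masks))

    def forward(self, x, masks):
        # x: (batch, len, d_model)
        # masks: float tensor (num_masks, len, len), 1 = may attend, 0 = blocked.
        b, n, _ = x.shape
        q, k, v = self.qkv(x).chunk(3, dim=-1)
        q = q.view(b, n, self.num_heads, self.d_head).transpose(1, 2)
        k = k.view(b, n, self.num_heads, self.d_head).transpose(1, 2)
        v = v.view(b, n, self.num_heads, self.d_head).transpose(1, 2)
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5   # (b, heads, n, n)
        # Each head combines the binary masks according to its learned weights.
        weights = F.softmax(self.mask_logits, dim=-1)            # (heads, num_masks)
        soft_mask = torch.einsum('hm,mij->hij', weights, masks)  # (heads, n, n)
        scores = scores + torch.log(soft_mask.clamp_min(1e-9))   # suppress blocked pairs
        attn = F.softmax(scores, dim=-1)
        return self.out((attn @ v).transpose(1, 2).reshape(b, n, -1))
```

In this sketch the mixture weights are trained jointly with the rest of the model, while the binary masks themselves (e.g., encoding parent, descendant, or sibling relations in the source parse) would be precomputed from the parse tree before encoding.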
Anthology ID: 2021.naacl-srw.7
Volume: Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop
Month: June
Year: 2021
Address: Online
Editors: Esin Durmus, Vivek Gupta, Nelson Liu, Nanyun Peng, Yu Su
Venue: NAACL
Publisher: Association for Computational Linguistics
Pages: 47–52
URL: https://aclanthology.org/2021.naacl-srw.7
DOI: 10.18653/v1/2021.naacl-srw.7
Cite (ACL): Colin McDonald and David Chiang. 2021. Syntax-Based Attention Masking for Neural Machine Translation. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Student Research Workshop, pages 47–52, Online. Association for Computational Linguistics.
Cite (Informal): Syntax-Based Attention Masking for Neural Machine Translation (McDonald & Chiang, NAACL 2021)
PDF: https://aclanthology.org/2021.naacl-srw.7.pdf
Video: https://aclanthology.org/2021.naacl-srw.7.mp4