Transformer with Syntactic Position Encoding for Machine Translation

Yikuan Xie, Wenyong Wang, Mingqian Du, Qing He


Abstract
It has been widely recognized that syntax information can help end-to-end neural machine translation (NMT) systems achieve better translation. To integrate dependency information into Transformer-based NMT, existing approaches either exploit words’ local head-dependent relations, ignoring non-local neighbors that carry important context, or approximate two words’ syntactic relation by their relative distance on the dependency tree, sacrificing exactness. To address these issues, we propose global positional encoding for the dependency tree, a new scheme that models the syntactic relation between any two words exactly and without the immediate-neighbor constraint. Experimental results on the NC11 German→English, English→German, and WMT English→German datasets show that our approach is more effective than the above two strategies. In addition, our experiments quantitatively show that the lower layers of the model are a more suitable place than the higher layers to incorporate syntax information, in terms of both each layer’s preference for syntactic patterns and the final performance.
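The “relative distance on the dependency tree” strategy that the abstract contrasts against can be illustrated with a minimal sketch (not the paper’s method): given a dependency parse as a head array, the syntactic distance between any two tokens is the length of the shortest path between them on the undirected tree, computable by breadth-first search from each token.

```python
from collections import deque

def tree_distances(heads):
    """Pairwise token distances on a dependency tree.

    heads[i] is the index of token i's head; the root points to itself.
    Returns an n x n matrix of shortest-path lengths on the undirected tree.
    (Illustrative sketch of the tree-distance approximation, not the
    paper's global positional encoding.)
    """
    n = len(heads)
    # Build undirected adjacency from head-dependent arcs.
    adj = [[] for _ in range(n)]
    for i, h in enumerate(heads):
        if h != i:  # skip the root's self-loop
            adj[i].append(h)
            adj[h].append(i)
    # BFS from every token to fill the distance matrix.
    dist = [[0] * n for _ in range(n)]
    for src in range(n):
        seen = {src}
        q = deque([(src, 0)])
        while q:
            node, d = q.popleft()
            dist[src][node] = d
            for nb in adj[node]:
                if nb not in seen:
                    seen.add(nb)
                    q.append((nb, d + 1))
    return dist

# "the boy runs": "the" -> "boy" -> "runs" (root at index 2)
d = tree_distances([1, 2, 2])
```

Two tokens that are far apart in the sentence but syntactically linked get a small tree distance here, which is exactly the signal such approaches feed into attention; the abstract’s point is that collapsing the relation to a single scalar distance loses exactness.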
Anthology ID:
2021.ranlp-1.172
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month:
September
Year:
2021
Address:
Held Online
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd.
Note:
Pages:
1536–1544
URL:
https://aclanthology.org/2021.ranlp-1.172
Cite (ACL):
Yikuan Xie, Wenyong Wang, Mingqian Du, and Qing He. 2021. Transformer with Syntactic Position Encoding for Machine Translation. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 1536–1544, Held Online. INCOMA Ltd.
Cite (Informal):
Transformer with Syntactic Position Encoding for Machine Translation (Xie et al., RANLP 2021)
PDF:
https://aclanthology.org/2021.ranlp-1.172.pdf