Dependency-Based Relative Positional Encoding for Transformer NMT

Yutaro Omote, Akihiro Tamura, Takashi Ninomiya


Abstract
This paper proposes a new Transformer neural machine translation model that incorporates syntactic distances between two source words into the relative position representations of the self-attention mechanism. In particular, the proposed model encodes pairwise relative depths on a source dependency tree, i.e., the differences between the depths of two source words, in the encoder's self-attention. The experiments show that the proposed model achieves a 0.5-point gain in BLEU on the Asian Scientific Paper Excerpt Corpus Japanese-to-English translation task.
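To make the idea concrete, the following is a minimal, hypothetical PyTorch sketch (not the authors' released code) of how pairwise relative dependency depths could be computed from head indices and used as relative-position indices that bias self-attention scores, in the spirit of Shaw et al. (2018)-style relative position representations. The clipping distance, single-head layout, and all function and class names are illustrative assumptions.

```python
# Hypothetical sketch (not the authors' implementation): pairwise relative
# dependency depths used as relative-position indices in self-attention.
import torch
import torch.nn as nn
import torch.nn.functional as F

def relative_depths(heads, max_dist=4):
    """heads[i] = index of the head of token i (-1 for the root).
    Returns an (n, n) index matrix of clipped depth differences."""
    n = len(heads)
    depth = [0] * n
    for i in range(n):
        d, h = 0, heads[i]
        while h != -1:          # walk up to the root to obtain the depth
            d += 1
            h = heads[h]
        depth[i] = d
    rel = torch.tensor([[max(-max_dist, min(max_dist, depth[i] - depth[j]))
                         for j in range(n)] for i in range(n)])
    return rel + max_dist       # shift into [0, 2*max_dist] for embedding lookup

class DepthRelativeSelfAttention(nn.Module):
    """Single-head self-attention whose scores are biased by embeddings of
    pairwise relative dependency depths (illustrative only)."""
    def __init__(self, d_model, max_dist=4):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.k = nn.Linear(d_model, d_model)
        self.v = nn.Linear(d_model, d_model)
        self.rel_emb = nn.Embedding(2 * max_dist + 1, d_model)
        self.scale = d_model ** -0.5

    def forward(self, x, rel_idx):
        # x: (n, d_model); rel_idx: (n, n) indices from relative_depths()
        q, k, v = self.q(x), self.k(x), self.v(x)
        content = q @ k.t()                          # standard content scores
        r = self.rel_emb(rel_idx)                    # (n, n, d_model)
        position = torch.einsum('id,ijd->ij', q, r)  # query-to-relative-depth scores
        attn = F.softmax((content + position) * self.scale, dim=-1)
        return attn @ v

# Example: "He saw a dog", with "saw" (index 1) as the root
heads = [1, -1, 3, 1]
x = torch.randn(4, 64)
layer = DepthRelativeSelfAttention(64)
out = layer(x, relative_depths(heads))
```

In this sketch the relative-depth embeddings are added to the attention logits before the softmax, so tokens that are close in the dependency tree can attend to each other differently from tokens that are merely close in the surface word order.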
Anthology ID:
R19-1099
Volume:
Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019)
Month:
September
Year:
2019
Address:
Varna, Bulgaria
Editors:
Ruslan Mitkov, Galia Angelova
Venue:
RANLP
Publisher:
INCOMA Ltd.
Pages:
854–861
URL:
https://aclanthology.org/R19-1099
DOI:
10.26615/978-954-452-056-4_099
Cite (ACL):
Yutaro Omote, Akihiro Tamura, and Takashi Ninomiya. 2019. Dependency-Based Relative Positional Encoding for Transformer NMT. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2019), pages 854–861, Varna, Bulgaria. INCOMA Ltd.
Cite (Informal):
Dependency-Based Relative Positional Encoding for Transformer NMT (Omote et al., RANLP 2019)
PDF:
https://aclanthology.org/R19-1099.pdf