Neural Machine Translation Techniques for Named Entity Transliteration

Roman Grundkiewicz, Kenneth Heafield


Abstract
Transliterating named entities from one language into another can be approached as a neural machine translation (NMT) problem, for which we use deep attentional RNN encoder-decoder models. To build a strong transliteration system, we apply well-established techniques from NMT, such as dropout regularization, model ensembling, rescoring with right-to-left models, and back-translation. Our submission to the NEWS 2018 Shared Task on Named Entity Transliteration ranked first in several tracks.
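Two of the techniques the abstract names, model ensembling and rescoring with right-to-left models, can be illustrated with a minimal sketch. This is not the authors' code (their implementation is in the linked repository); the scoring functions below are hypothetical stand-ins for trained left-to-right and right-to-left models, each mapping a candidate transliteration to a log-probability.

```python
# Illustrative sketch only: ensembling and right-to-left rescoring of an
# n-best list, with toy scoring functions in place of trained RNN models.

def ensemble_score(candidate, models):
    # Ensembling: average the log-probabilities assigned by each model.
    return sum(model(candidate) for model in models) / len(models)

def rescore(nbest, l2r_models, r2l_model, alpha=0.5):
    # Interpolate the left-to-right ensemble score with a right-to-left
    # model's score of the reversed candidate, then re-rank the n-best list.
    scored = []
    for cand in nbest:
        score = ensemble_score(cand, l2r_models)
        score += alpha * r2l_model(cand[::-1])
        scored.append((score, cand))
    return [cand for _, cand in sorted(scored, reverse=True)]

# Hypothetical toy models: prefer candidates of a target length.
l2r = [lambda c: -abs(len(c) - 5)]
r2l = lambda c: 0.0
print(rescore(["abcde", "ab", "abcdefg"], l2r, r2l))
```

In practice the scores would come from the paper's attentional encoder-decoder models, and the weights for combining them would be tuned on development data.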
Anthology ID:
W18-2413
Volume:
Proceedings of the Seventh Named Entities Workshop
Month:
July
Year:
2018
Address:
Melbourne, Australia
Editors:
Nancy Chen, Rafael E. Banchs, Xiangyu Duan, Min Zhang, Haizhou Li
Venue:
NEWS
Publisher:
Association for Computational Linguistics
Pages:
89–94
URL:
https://aclanthology.org/W18-2413
DOI:
10.18653/v1/W18-2413
Bibkey:
Cite (ACL):
Roman Grundkiewicz and Kenneth Heafield. 2018. Neural Machine Translation Techniques for Named Entity Transliteration. In Proceedings of the Seventh Named Entities Workshop, pages 89–94, Melbourne, Australia. Association for Computational Linguistics.
Cite (Informal):
Neural Machine Translation Techniques for Named Entity Transliteration (Grundkiewicz & Heafield, NEWS 2018)
PDF:
https://aclanthology.org/W18-2413.pdf
Code:
snukky/news-translit-nmt