UCSYNLP-Lab Machine Translation Systems for WAT 2019

Yimon ShweSin, Win Pa Pa, KhinMar Soe


Abstract
This paper describes the UCSYNLP-Lab submission to WAT 2019 for the Myanmar-English translation tasks in both directions. We used neural machine translation systems with an attention model, trained on the UCSY corpus and the ALT corpus. In the attention-based NMT systems, we apply both word-level and syllable-level segmentation. Notably, the UCSY corpus was cleaned for WAT 2019, so it is not identical to the version used in WAT 2018. Experiments show that the translation systems produce substantial improvements.
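The abstract mentions syllable-level segmentation for Myanmar. As a rough illustration only (an assumption, not the authors' exact preprocessing tool), a common rule of thumb is to break before each Myanmar consonant (U+1000–U+1021) unless it is stacked (preceded by the virama U+1039) or devowelized (followed by the asat U+103A). A minimal sketch:

import re

# Illustrative Myanmar syllable segmenter (a simplified sketch, not the
# authors' pipeline): insert a space before each consonant (U+1000-U+1021)
# unless it is stacked (preceded by virama U+1039) or followed by
# asat U+103A / virama U+1039, which mark finals and stacked consonants.
_SYLLABLE_BREAK = re.compile(r"(?<!\u1039)([\u1000-\u1021])(?![\u103A\u1039])")

def syllable_segment(text: str) -> str:
    """Return the input with spaces inserted at approximate syllable boundaries."""
    return _SYLLABLE_BREAK.sub(r" \1", text).strip()

if __name__ == "__main__":
    # e.g. "မြန်မာ" ("Myanmar") -> "မြန် မာ" (two syllables)
    print(syllable_segment("မြန်မာ"))

Syllable-level segmentation of this kind avoids the need for a Myanmar word segmenter and shrinks the vocabulary, which is one common motivation for comparing it against word-level segmentation in Myanmar-English NMT.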
Anthology ID:
D19-5226
Volume:
Proceedings of the 6th Workshop on Asian Translation
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Toshiaki Nakazawa, Chenchen Ding, Raj Dabre, Anoop Kunchukuttan, Nobushige Doi, Yusuke Oda, Ondřej Bojar, Shantipriya Parida, Isao Goto, Hideya Mino
Venue:
WAT
Publisher:
Association for Computational Linguistics
Pages:
195–199
URL:
https://aclanthology.org/D19-5226
DOI:
10.18653/v1/D19-5226
Cite (ACL):
Yimon ShweSin, Win Pa Pa, and KhinMar Soe. 2019. UCSYNLP-Lab Machine Translation Systems for WAT 2019. In Proceedings of the 6th Workshop on Asian Translation, pages 195–199, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
UCSYNLP-Lab Machine Translation Systems for WAT 2019 (ShweSin et al., WAT 2019)
PDF:
https://aclanthology.org/D19-5226.pdf