NMT or SMT: Case Study of a Narrow-domain English-Latvian Post-editing Project

Inguna Skadiņa, Mārcis Pinnis


Abstract
The recent technological shift in machine translation from statistical machine translation (SMT) to neural machine translation (NMT) raises the question of NMT's strengths and weaknesses. In this paper, we present an analysis of the outputs of narrow-domain English-Latvian NMT and SMT systems that were trained on a rather small amount of data. We analyze post-edits produced by professional translators and errors manually annotated in these outputs. The analysis of post-edits leads us to conclude that both approaches are comparably successful and increase translators’ productivity, with the NMT system showing slightly worse results. The analysis of annotated errors shows that NMT translations are more fluent than SMT translations. However, errors related to accuracy, especially mistranslation and omission errors, occur more often in NMT outputs. Word form errors, which reflect the morphological richness of Latvian, are frequent for both systems, but slightly less frequent in NMT outputs.
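As a rough illustration of how post-editing effort can be quantified in studies of this kind (the abstract does not prescribe a specific metric, and this sketch is not the authors' method), the following Python snippet computes a word-level edit distance between an MT output and its post-edited version, normalized by the post-edit length. It is TER-like but omits shift operations; the example sentences and function names are hypothetical.

# Minimal sketch (not the authors' method): quantify post-editing effort
# as a word-level edit distance between the raw MT output and its
# post-edited version, normalized by the post-edit length.

def word_edit_distance(hyp: str, ref: str) -> int:
    """Word-level Levenshtein distance between two sentences."""
    h, r = hyp.split(), ref.split()
    # dp[i][j] = edits needed to turn the first i words of h into the first j words of r
    dp = [[0] * (len(r) + 1) for _ in range(len(h) + 1)]
    for i in range(len(h) + 1):
        dp[i][0] = i
    for j in range(len(r) + 1):
        dp[0][j] = j
    for i in range(1, len(h) + 1):
        for j in range(1, len(r) + 1):
            cost = 0 if h[i - 1] == r[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,          # deletion
                           dp[i][j - 1] + 1,          # insertion
                           dp[i - 1][j - 1] + cost)   # substitution
    return dp[len(h)][len(r)]

def post_edit_effort(mt_output: str, post_edit: str) -> float:
    """Edit distance normalized by post-edit length (lower = less editing effort)."""
    return word_edit_distance(mt_output, post_edit) / max(len(post_edit.split()), 1)

# Hypothetical example, not from the paper's data:
print(post_edit_effort("the cat sat on mat", "the cat sat on the mat"))  # ~0.167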
Anthology ID: I17-1038
Volume: Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month: November
Year: 2017
Address: Taipei, Taiwan
Editors: Greg Kondrak, Taro Watanabe
Venue: IJCNLP
Publisher: Asian Federation of Natural Language Processing
Pages: 373–383
URL: https://aclanthology.org/I17-1038
Cite (ACL):
Inguna Skadiņa and Mārcis Pinnis. 2017. NMT or SMT: Case Study of a Narrow-domain English-Latvian Post-editing Project. In Proceedings of the Eighth International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 373–383, Taipei, Taiwan. Asian Federation of Natural Language Processing.
Cite (Informal):
NMT or SMT: Case Study of a Narrow-domain English-Latvian Post-editing Project (Skadiņa & Pinnis, IJCNLP 2017)
PDF: https://aclanthology.org/I17-1038.pdf
Data: WMT 2016