Data Processing Matters: SRPH-Konvergen AI’s Machine Translation System for WMT’21

Lintang Sutawika, Jan Christian Blaise Cruz


Abstract
In this paper, we describe the submission of the joint Samsung Research Philippines-Konvergen AI team for the WMT’21 Large Scale Multilingual Translation Task - Small Track 2. We submit a standard Seq2Seq Transformer model to the shared task without any training or architecture tricks, relying mainly on the strength of our data preprocessing techniques to boost performance. Our final submission model scored 22.92 average BLEU on the FLORES-101 devtest set and 22.97 average BLEU on the contest’s hidden test set, ranking us sixth overall. Despite using only a standard Transformer, our model ranked first in Indonesian to Javanese, showing that data preprocessing matters as much as, if not more than, cutting-edge model architectures and training techniques.
Anthology ID:
2021.wmt-1.52
Volume:
Proceedings of the Sixth Conference on Machine Translation
Month:
November
Year:
2021
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
431–438
URL:
https://aclanthology.org/2021.wmt-1.52
Cite (ACL):
Lintang Sutawika and Jan Christian Blaise Cruz. 2021. Data Processing Matters: SRPH-Konvergen AI’s Machine Translation System for WMT’21. In Proceedings of the Sixth Conference on Machine Translation, pages 431–438, Online. Association for Computational Linguistics.
Cite (Informal):
Data Processing Matters: SRPH-Konvergen AI’s Machine Translation System for WMT’21 (Sutawika & Cruz, WMT 2021)
PDF:
https://aclanthology.org/2021.wmt-1.52.pdf
Data
FLoRes-101