Alibaba Submission to the WMT20 Parallel Corpus Filtering Task

Jun Lu, Xin Ge, Yangbin Shi, Yuqi Zhang


Abstract
This paper describes the Alibaba Machine Translation Group submissions to the WMT 2020 Shared Task on Parallel Corpus Filtering and Alignment. In the filtering task, three main methods are applied to evaluate the quality of the parallel corpus: (a) a dual bilingual GPT-2 model, (b) a dual conditional cross-entropy model, and (c) an IBM word alignment model. The scores of these models are combined using a positive-unlabeled (PU) learning model together with a brute-force search over combination weights to obtain additional gains. In addition, a few simple but effective rules are adopted to evaluate the quality and diversity of the corpus. In the alignment-filtering task, the extraction pipeline for bilingual sentence pairs comprises the following steps: bilingual lexicon mining, language identification, sentence segmentation, and sentence alignment. The final results show that, in both the filtering and alignment tasks, our system significantly outperforms the LASER-based baseline.
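The dual conditional cross-entropy method mentioned in the abstract scores a sentence pair with two translation models, one per direction, and rewards pairs whose forward and backward per-token cross-entropies are both low and in agreement. A minimal sketch of that scoring rule, following the commonly used formulation (the function name and inputs are illustrative, not taken from the paper):

```python
import math

def dual_xent_score(h_fwd: float, h_bwd: float) -> float:
    """Score a sentence pair from two per-token cross-entropies.

    h_fwd: cross-entropy of the target given the source (src->tgt model).
    h_bwd: cross-entropy of the source given the target (tgt->src model).

    Penalizes both high average entropy (an unlikely pair) and
    disagreement between the two directions; maps the result to (0, 1],
    where higher means a better pair.
    """
    disagreement = abs(h_fwd - h_bwd)
    average = 0.5 * (h_fwd + h_bwd)
    return math.exp(-(disagreement + average))

# A pair both models find likely outscores one where the
# directions disagree, even at the same best-direction entropy.
good = dual_xent_score(1.0, 1.2)
skewed = dual_xent_score(1.0, 5.0)
```

Per-pair scores like this can then be fed, alongside the GPT-2 and word-alignment scores, into the PU-learning combination step described above.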
Anthology ID:
2020.wmt-1.111
Volume:
Proceedings of the Fifth Conference on Machine Translation
Month:
November
Year:
2020
Address:
Online
Editors:
Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Yvette Graham, Paco Guzman, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, André Martins, Makoto Morishita, Christof Monz, Masaaki Nagata, Toshiaki Nakazawa, Matteo Negri
Venue:
WMT
SIG:
SIGMT
Publisher:
Association for Computational Linguistics
Pages:
979–984
URL:
https://aclanthology.org/2020.wmt-1.111
Cite (ACL):
Jun Lu, Xin Ge, Yangbin Shi, and Yuqi Zhang. 2020. Alibaba Submission to the WMT20 Parallel Corpus Filtering Task. In Proceedings of the Fifth Conference on Machine Translation, pages 979–984, Online. Association for Computational Linguistics.
Cite (Informal):
Alibaba Submission to the WMT20 Parallel Corpus Filtering Task (Lu et al., WMT 2020)
PDF:
https://aclanthology.org/2020.wmt-1.111.pdf
Video:
https://slideslive.com/38939580