Zixuan Wang
2020
Tencent submission for WMT20 Quality Estimation Shared Task
Haijiang Wu | Zixuan Wang | Qingsong Ma | Xinjie Wen | Ruichen Wang | Xiaoli Wang | Yulin Zhang | Zhipeng Yao | Siyao Peng
Proceedings of the Fifth Conference on Machine Translation
This paper presents Tencent’s submission to the WMT20 Quality Estimation (QE) Shared Task: Sentence-Level Post-editing Effort for English-Chinese in Task 2. Our system ensembles two architectures, XLM-based and Transformer-based Predictor-Estimator models. In the XLM-based Predictor-Estimator architecture, the predictor produces two types of contextualized token representations, i.e., masked XLM and non-masked XLM; the LSTM estimator and Transformer estimator employ two effective strategies, top-K and multi-head attention, to enhance the sentence feature representation. For the Transformer-based Predictor-Estimator architecture, we improve a top-performing model with three modifications: using multi-decoding in the machine translation module, creating a new model by replacing the Transformer-based predictor with an XLM-based predictor, and finally integrating the two models via a weighted average. Our submission achieves a Pearson correlation of 0.664, ranking first (tied) on English-Chinese.
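The abstract names two estimator-side strategies, multi-head attention and top-K selection over predictor token features, plus a weighted-average ensemble of the two architectures. The PyTorch sketch below illustrates one plausible reading of those ideas; the class name `TopKAttentionEstimator`, the dimensions, and the pooling order are illustrative assumptions, not the authors' released code.

```python
import torch
import torch.nn as nn


class TopKAttentionEstimator(nn.Module):
    """Hypothetical estimator head: pools predictor token features into a
    sentence-level QE score via multi-head attention plus top-K selection."""

    def __init__(self, hidden_dim: int = 768, num_heads: int = 8, k: int = 5):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden_dim, num_heads, batch_first=True)
        self.salience = nn.Linear(hidden_dim, 1)  # per-token importance score
        self.out = nn.Linear(hidden_dim, 1)       # sentence-level score head
        self.k = k

    def forward(self, token_feats: torch.Tensor) -> torch.Tensor:
        # token_feats: (batch, seq_len, hidden_dim) contextualized features
        # from a (masked or non-masked) XLM predictor; assumes seq_len >= k.
        attended, _ = self.attn(token_feats, token_feats, token_feats)
        # Top-K: keep the k most salient tokens and average them into a
        # single sentence representation.
        scores = self.salience(attended).squeeze(-1)       # (batch, seq_len)
        idx = scores.topk(self.k, dim=1).indices           # (batch, k)
        idx = idx.unsqueeze(-1).expand(-1, -1, attended.size(-1))
        sent = attended.gather(1, idx).mean(dim=1)         # (batch, hidden_dim)
        return self.out(sent).squeeze(-1)                  # HTER-style score


def ensemble(xlm_score: torch.Tensor, trf_score: torch.Tensor, w: float = 0.5):
    """Weighted average of the two architectures' sentence scores."""
    return w * xlm_score + (1.0 - w) * trf_score
```

In practice the ensemble weight `w` would be tuned on the development set; 0.5 here is only a placeholder.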