HW-TSC’s Participation in the WMT 2021 Efficiency Shared Task

Hengchao Shang, Ting Hu, Daimeng Wei, Zongyao Li, Jianfei Feng, ZhengZhe Yu, Jiaxin Guo, Shaojun Li, Lizhi Lei, ShiMin Tao, Hao Yang, Jun Yao, Ying Qin


Abstract
This paper presents the submission of Huawei Translation Services Center (HW-TSC) to the WMT 2021 Efficiency Shared Task. We explore sentence-level teacher-student distillation and train several small models that strike a balance between efficiency and quality. Our models feature a deep encoder, a shallow decoder, and lightweight RNN layers based on the SSRU. We use Huawei Noah's Bolt, an efficient and lightweight library for on-device inference. Leveraging INT8 quantization, a custom General Matrix Multiplication (GEMM) operator, a shortlist, greedy search, and caching, we submit four small, efficient translation models with high translation quality to the single-CPU-core latency track.
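The SSRU (Simpler Simple Recurrent Unit) named in the abstract is, in its standard formulation (Kim et al., 2019), a gated recurrent layer whose forget gate depends only on the current input, so each decoding step costs just two matrix-vector products and replaces tanh with ReLU. The following is a minimal NumPy sketch of that standard formulation, not HW-TSC's actual implementation; the class name, weight shapes, and initialization are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SSRU:
    """Simpler Simple Recurrent Unit (Kim et al., 2019).

    A lightweight recurrent layer commonly used in place of decoder
    self-attention. Names and initialization here are illustrative.
    """

    def __init__(self, d_model, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(d_model)
        self.W = rng.normal(0.0, scale, (d_model, d_model))    # cell projection
        self.W_f = rng.normal(0.0, scale, (d_model, d_model))  # forget gate
        self.b_f = np.zeros(d_model)

    def step(self, x_t, c_prev):
        """One decoding step: returns (output, new cell state)."""
        # The forget gate depends only on the input, not on c_prev, so
        # there is no recurrent matrix multiply: just two GEMVs per step,
        # which is what makes the layer attractive for CPU decoding.
        f_t = sigmoid(self.W_f @ x_t + self.b_f)
        c_t = f_t * c_prev + (1.0 - f_t) * (self.W @ x_t)
        # ReLU output activation instead of tanh.
        return np.maximum(c_t, 0.0), c_t

# Tiny usage example with a toy hidden size.
layer = SSRU(d_model=8)
c = np.zeros(8)
for t in range(3):
    x = np.full(8, 0.1)           # stand-in for a decoder input vector
    h, c = layer.step(x, c)
```

Because the recurrence is elementwise, the cell state per hypothesis can be carried across decoding steps, which pairs naturally with the greedy search and caching the abstract mentions.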
Anthology ID: 2021.wmt-1.75
Volume: Proceedings of the Sixth Conference on Machine Translation
Month: November
Year: 2021
Address: Online
Editors: Loïc Barrault, Ondřej Bojar, Fethi Bougares, Rajen Chatterjee, Marta R. Costa-jussà, Christian Federmann, Mark Fishel, Alexander Fraser, Markus Freitag, Yvette Graham, Roman Grundkiewicz, Paco Guzmán, Barry Haddow, Matthias Huck, Antonio Jimeno Yepes, Philipp Koehn, Tom Kocmi, André Martins, Makoto Morishita, Christof Monz
Venue: WMT
SIG: SIGMT
Publisher: Association for Computational Linguistics
Pages: 781–786
URL: https://aclanthology.org/2021.wmt-1.75
Cite (ACL): Hengchao Shang, Ting Hu, Daimeng Wei, Zongyao Li, Jianfei Feng, ZhengZhe Yu, Jiaxin Guo, Shaojun Li, Lizhi Lei, ShiMin Tao, Hao Yang, Jun Yao, and Ying Qin. 2021. HW-TSC's Participation in the WMT 2021 Efficiency Shared Task. In Proceedings of the Sixth Conference on Machine Translation, pages 781–786, Online. Association for Computational Linguistics.
Cite (Informal): HW-TSC's Participation in the WMT 2021 Efficiency Shared Task (Shang et al., WMT 2021)
PDF: https://aclanthology.org/2021.wmt-1.75.pdf