Continual Knowledge Distillation for Neural Machine Translation

Yuanchi Zhang, Peng Li, Maosong Sun, Yang Liu


Abstract
While many parallel corpora are not publicly accessible due to data copyright, data privacy, and competitive differentiation concerns, trained translation models are increasingly available on open platforms. In this work, we propose a method called continual knowledge distillation to take advantage of existing translation models to improve one model of interest. The basic idea is to sequentially transfer knowledge from each trained model to the distilled model. Extensive experiments on Chinese-English and German-English datasets show that our method achieves significant and consistent improvements over strong baselines under both homogeneous and heterogeneous trained model settings and is robust to malicious models.
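The core idea stated in the abstract, sequentially transferring knowledge from each available trained model into the model of interest, can be illustrated with a minimal, hypothetical PyTorch sketch. The function names, loss weighting, and training loop below are illustrative assumptions, not the paper's actual algorithm: they only show generic sequential knowledge distillation, combining cross-entropy on reference translations with a KL term toward each teacher in turn.

# Minimal sketch (assumptions: models are callables taking (src, tgt_in) and
# returning token logits of shape (batch, len, vocab); alpha and T are
# illustrative hyperparameters, not values from the paper).
import torch
import torch.nn.functional as F

def distill_step(student, teacher, src, tgt_in, tgt_out, pad_id, alpha=0.5, T=1.0):
    """One training step mixing translation loss with distillation loss."""
    student_logits = student(src, tgt_in)          # (batch, len, vocab)
    with torch.no_grad():
        teacher_logits = teacher(src, tgt_in)      # same shape as student_logits

    # Standard cross-entropy against the reference translation.
    ce = F.cross_entropy(
        student_logits.transpose(1, 2), tgt_out, ignore_index=pad_id
    )
    # KL divergence from the teacher's token distribution to the student's.
    kd = F.kl_div(
        F.log_softmax(student_logits / T, dim=-1),
        F.softmax(teacher_logits / T, dim=-1),
        reduction="batchmean",
    ) * (T * T)
    return (1 - alpha) * ce + alpha * kd

def continual_distillation(student, teachers, dataloader, optimizer, pad_id):
    """Transfer knowledge from each trained teacher to the student in sequence."""
    for teacher in teachers:                       # teachers arrive one after another
        teacher.eval()
        for src, tgt_in, tgt_out in dataloader:
            loss = distill_step(student, teacher, src, tgt_in, tgt_out, pad_id)
            optimizer.zero_grad()
            loss.backward()
            optimizer.step()
    return student

This sketch omits the paper's specific mechanisms for accumulating and consolidating knowledge across teachers; it only conveys the sequential teacher-to-student transfer described in the abstract.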
Anthology ID:
2023.acl-long.443
Volume:
Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7978–7996
URL:
https://aclanthology.org/2023.acl-long.443
DOI:
10.18653/v1/2023.acl-long.443
Cite (ACL):
Yuanchi Zhang, Peng Li, Maosong Sun, and Yang Liu. 2023. Continual Knowledge Distillation for Neural Machine Translation. In Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7978–7996, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Continual Knowledge Distillation for Neural Machine Translation (Zhang et al., ACL 2023)
PDF:
https://aclanthology.org/2023.acl-long.443.pdf
Video:
https://aclanthology.org/2023.acl-long.443.mp4