Helping the Weak Makes You Strong: Simple Multi-Task Learning Improves Non-Autoregressive Translators

Xinyou Wang, Zaixiang Zheng, Shujian Huang


Abstract
Recently, non-autoregressive (NAR) neural machine translation models have received increasing attention due to their efficient parallel decoding. However, the probabilistic framework of NAR models necessitates a conditional independence assumption on target sequences, falling short of characterizing human language data. This drawback results in less informative learning signals for NAR models under conventional MLE training, thereby yielding unsatisfactory accuracy compared to their autoregressive (AR) counterparts. In this paper, we propose a simple and model-agnostic multi-task learning framework to provide more informative learning signals. During the training stage, we introduce a set of sufficiently weak AR decoders that rely solely on the information provided by the NAR decoder to make predictions, forcing the NAR decoder to become stronger; otherwise it would be unable to support its weak AR partners. Experiments on WMT and IWSLT datasets show that our approach consistently improves the accuracy of multiple NAR baselines without adding any decoding overhead.
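The abstract describes a joint training objective in which weak AR decoders condition only on the NAR decoder's hidden states, so their gradients push those states to carry more information. The following is a minimal, hypothetical PyTorch sketch of that idea only; the module names (WeakARDecoder, multitask_loss), the single-layer GRU heads, and the way NAR states are fed to the weak decoders are illustrative assumptions, not the authors' implementation.

```python
# Sketch (not the authors' code) of multi-task training with weak AR heads
# attached to a NAR decoder, as described in the abstract. All names and
# architectural choices here are assumptions for illustration.
import torch
import torch.nn as nn

class WeakARDecoder(nn.Module):
    """A deliberately weak AR decoder: a single GRU layer whose only source
    of context is the NAR decoder's hidden states (no encoder access)."""
    def __init__(self, vocab_size: int, d_model: int):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        self.gru = nn.GRU(2 * d_model, d_model, batch_first=True)
        self.proj = nn.Linear(d_model, vocab_size)

    def forward(self, prev_tokens: torch.Tensor, nar_states: torch.Tensor) -> torch.Tensor:
        # Concatenate teacher-forced previous tokens with the position-aligned
        # NAR hidden states, then predict the next token autoregressively.
        x = torch.cat([self.embed(prev_tokens), nar_states], dim=-1)
        h, _ = self.gru(x)
        return self.proj(h)

def multitask_loss(nar_logits, nar_states, weak_decoders, targets, pad_id=0):
    """Total loss = NAR cross-entropy + cross-entropy of each weak AR head.
    Gradients from the weak heads flow back into the NAR decoder states,
    providing the extra learning signal described in the abstract."""
    ce = nn.CrossEntropyLoss(ignore_index=pad_id)
    loss = ce(nar_logits.transpose(1, 2), targets)          # (B, V, T) vs (B, T)
    # Shift targets right for teacher-forced AR prediction.
    prev = torch.cat([torch.full_like(targets[:, :1], pad_id), targets[:, :-1]], dim=1)
    for dec in weak_decoders:
        ar_logits = dec(prev, nar_states)
        loss = loss + ce(ar_logits.transpose(1, 2), targets)
    return loss
```

At inference time the weak AR heads would simply be discarded, which is consistent with the claim that the method adds no decoding overhead.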
Anthology ID:
2022.emnlp-main.371
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5513–5519
URL:
https://aclanthology.org/2022.emnlp-main.371
DOI:
10.18653/v1/2022.emnlp-main.371
Cite (ACL):
Xinyou Wang, Zaixiang Zheng, and Shujian Huang. 2022. Helping the Weak Makes You Strong: Simple Multi-Task Learning Improves Non-Autoregressive Translators. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 5513–5519, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Helping the Weak Makes You Strong: Simple Multi-Task Learning Improves Non-Autoregressive Translators (Wang et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.371.pdf