Adaptively Scheduled Multitask Learning: The Case of Low-Resource Neural Machine Translation

Poorya Zaremoodi, Gholamreza Haffari


Abstract
Neural Machine Translation (NMT), a data-hungry technology, suffers from the lack of bilingual data in low-resource scenarios. Multitask learning (MTL) can alleviate this issue by injecting inductive biases into NMT, using auxiliary syntactic and semantic tasks. However, an effective training schedule is required to balance the importance of tasks and make the best use of the training signals. The role of the training schedule becomes even more crucial in biased-MTL, where the goal is to improve one task (or a subset of tasks) the most, e.g., translation quality. Current approaches to biased-MTL rely on brittle hand-engineered heuristics that require trial and error and must be (re-)designed for each learning scenario. To the best of our knowledge, ours is the first work on adaptively and dynamically changing the training schedule in biased-MTL. We propose a rigorous approach for automatically reweighting the training data of the main and auxiliary tasks throughout the training process based on their contributions to the generalisability of the main NMT task. Our experiments on translating from English to Vietnamese/Turkish/Spanish show improvements of up to +1.2 BLEU points compared to strong baselines. Additionally, our analyses shed light on the dynamics of NMT's needs throughout training: from syntax to semantics.
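To make the idea of adaptive task reweighting concrete, the toy NumPy sketch below illustrates one plausible instantiation, not the authors' actual training procedure: each task's gradient is weighted by how well it aligns with the main task's gradient on held-out (dev) data, so the mixing weights change adaptively over training. The linear-regression tasks, function names, and hyperparameters are all hypothetical.

# Minimal sketch of adaptive task reweighting (illustrative only; not the
# paper's exact algorithm). Tasks whose gradients currently agree with the
# main task's dev-set gradient receive larger weights in the shared update.
import numpy as np

rng = np.random.default_rng(0)
dim = 16
w = np.zeros(dim)                                  # shared parameters (toy linear model)

def make_task(true_w, n=64, noise=0.1):
    """Generate a synthetic linear-regression task (hypothetical stand-in for a real task)."""
    X = rng.normal(size=(n, dim))
    y = X @ true_w + noise * rng.normal(size=n)
    return X, y

def grad(task, w):
    """Mean squared-error gradient of the toy task w.r.t. the shared parameters."""
    X, y = task
    return 2.0 * X.T @ (X @ w - y) / len(y)

main_w = rng.normal(size=dim)
tasks = {
    "main_train": make_task(main_w),
    "main_dev":   make_task(main_w),               # held-out data: proxy for generalisability
    "aux_related": make_task(main_w + 0.3 * rng.normal(size=dim)),
    "aux_unrelated": make_task(rng.normal(size=dim)),
}

lr = 0.05
for step in range(200):
    g_dev = grad(tasks["main_dev"], w)             # generalisation signal for the main task
    update = np.zeros(dim)
    for name in ("main_train", "aux_related", "aux_unrelated"):
        g = grad(tasks[name], w)
        # Adaptive weight: positive part of cosine similarity with the dev gradient.
        cos = g @ g_dev / (np.linalg.norm(g) * np.linalg.norm(g_dev) + 1e-12)
        update += max(cos, 0.0) * g
    w -= lr * update

dev_X, dev_y = tasks["main_dev"]
print("final main-task dev MSE:", float(np.mean((dev_X @ w - dev_y) ** 2)))

In this sketch the unrelated auxiliary task is automatically down-weighted once its gradient stops aligning with the main task's dev gradient, which is the general flavour of reweighting by contribution to main-task generalisability described in the abstract.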
Anthology ID:
D19-5618
Volume:
Proceedings of the 3rd Workshop on Neural Generation and Translation
Month:
November
Year:
2019
Address:
Hong Kong
Editors:
Alexandra Birch, Andrew Finch, Hiroaki Hayashi, Ioannis Konstas, Thang Luong, Graham Neubig, Yusuke Oda, Katsuhito Sudoh
Venue:
NGT
Publisher:
Association for Computational Linguistics
Pages:
177–186
URL:
https://aclanthology.org/D19-5618
DOI:
10.18653/v1/D19-5618
Cite (ACL):
Poorya Zaremoodi and Gholamreza Haffari. 2019. Adaptively Scheduled Multitask Learning: The Case of Low-Resource Neural Machine Translation. In Proceedings of the 3rd Workshop on Neural Generation and Translation, pages 177–186, Hong Kong. Association for Computational Linguistics.
Cite (Informal):
Adaptively Scheduled Multitask Learning: The Case of Low-Resource Neural Machine Translation (Zaremoodi & Haffari, NGT 2019)
PDF:
https://aclanthology.org/D19-5618.pdf