Improving Zero-Shot Multilingual Text Generation via Iterative Distillation

Ernie Chang, Alex Marin, Vera Demberg


Abstract
The demand for multilingual dialogue systems often entails a costly labeling process, in which human translators derive utterances in low-resource languages from resource-rich language annotations. To this end, we explore leveraging the inductive biases for target languages learned by numerous pretrained teacher models by transferring them to student models via sequence-level knowledge distillation. Since no target-language text is assumed, both the teacher and student models need to learn from the target distribution in a few-/zero-shot manner. On the MultiATIS++ benchmark, we evaluate the effectiveness of the proposed technique for deriving multilingual text in six languages, using only monolingual English data and the pretrained models. We show that training on the synthetic multilingual generation outputs yields performance close to that of training on human annotations in both slot F1 and intent accuracy; the synthetic text also scores highly on naturalness and correctness in human evaluation.
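To make the distillation recipe concrete, below is a minimal sketch of one round of sequence-level knowledge distillation as described in the abstract: a pretrained teacher decodes full pseudo-target sentences from English utterances, and a student is then trained on those synthetic pairs, with the generate/train cycle repeated. The model names (Helsinki-NLP/opus-mt-en-de, google/mt5-small), hyperparameters, and single-teacher loop are illustrative assumptions, not the authors' exact configuration.

```python
# Sketch of iterative sequence-level knowledge distillation for zero-shot
# multilingual generation. Model choices and hyperparameters are assumptions
# for illustration only; the paper's setup may differ.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

device = "cuda" if torch.cuda.is_available() else "cpu"

# Teacher: a pretrained en->de translation model (assumed choice of teacher).
teacher_name = "Helsinki-NLP/opus-mt-en-de"
teacher_tok = AutoTokenizer.from_pretrained(teacher_name)
teacher = AutoModelForSeq2SeqLM.from_pretrained(teacher_name).to(device).eval()

# Student: a multilingual seq2seq model trained on the teacher's outputs.
student_name = "google/mt5-small"
student_tok = AutoTokenizer.from_pretrained(student_name)
student = AutoModelForSeq2SeqLM.from_pretrained(student_name).to(device)

# Monolingual English utterances (ATIS-style examples for illustration).
english_utterances = [
    "show me flights from boston to denver",
    "what is the cheapest fare to new york",
]

def teacher_pseudo_targets(texts):
    """Sequence-level KD: the teacher decodes whole pseudo-target sentences
    (rather than matching token-level distributions)."""
    batch = teacher_tok(texts, return_tensors="pt", padding=True).to(device)
    with torch.no_grad():
        out = teacher.generate(**batch, num_beams=4, max_length=64)
    return teacher_tok.batch_decode(out, skip_special_tokens=True)

optimizer = torch.optim.AdamW(student.parameters(), lr=3e-5)
for round_idx in range(2):  # "iterative": repeat the generate/train cycle
    pseudo = teacher_pseudo_targets(english_utterances)
    inputs = student_tok(english_utterances, return_tensors="pt",
                         padding=True).to(device)
    labels = student_tok(pseudo, return_tensors="pt",
                         padding=True).input_ids.to(device)
    labels[labels == student_tok.pad_token_id] = -100  # mask padding in loss
    loss = student(**inputs, labels=labels).loss  # cross-entropy on pseudo-targets
    loss.backward()
    optimizer.step()
    optimizer.zero_grad()
    print(f"round {round_idx}: loss={loss.item():.3f}")
```

In the full zero-shot setting, one would repeat this with multiple pretrained teachers and target languages, using each round's synthetic outputs as training data for the next; the two-example, two-round loop above only demonstrates the mechanics.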
Anthology ID:
2022.coling-1.513
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
5876–5881
URL:
https://aclanthology.org/2022.coling-1.513
Cite (ACL):
Ernie Chang, Alex Marin, and Vera Demberg. 2022. Improving Zero-Shot Multilingual Text Generation via Iterative Distillation. In Proceedings of the 29th International Conference on Computational Linguistics, pages 5876–5881, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Improving Zero-Shot Multilingual Text Generation via Iterative Distillation (Chang et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.513.pdf