Continuous Model Improvement for Language Understanding with Machine Translation

Abdalghani Abujabal, Claudio Delli Bovi, Sungho Ryu, Turan Gojayev, Fabian Triefenbach, Yannick Versley


Abstract
Scaling conversational personal assistants to a multitude of languages puts high demands on collecting and labeling data, a setting in which cross-lingual learning techniques can help to reconcile the need for well-performing Natural Language Understanding (NLU) with the need to support many languages without incurring unacceptable cost. In this work, we show that automatically annotating unlabeled utterances using Machine Translation in an offline fashion and adding them to the training data can improve performance for existing NLU features for low-resource languages, where a straightforward translate-test approach as considered in the existing literature would fail the latency requirements of a live environment. We demonstrate the effectiveness of our method with intrinsic and extrinsic evaluation using a real-world commercial dialog system in German. Beyond an intrinsic evaluation, where 56% of the resulting automatically labeled utterances had a perfect match with ground-truth labels, we see significant performance improvements in an extrinsic evaluation setting when manually labeled data is available only in small quantities.
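The abstract describes an offline translate-and-annotate pipeline rather than translation at query time. As an illustration only, the Python sketch below shows one way such a pipeline could be structured; the helpers translate_to_source, label_with_source_nlu, and project_slots are hypothetical placeholders for an MT system, an existing English NLU model, and a label projection step, and do not reflect the authors' actual implementation.

```python
# Illustrative sketch (not the authors' code): offline annotation of unlabeled
# German utterances via Machine Translation. The helper functions below are
# hypothetical hooks for an MT system and an existing English NLU model.

from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class LabeledUtterance:
    tokens: List[str]
    intent: str
    slots: List[str]  # one slot tag per token (e.g. a BIO scheme)


def translate_to_source(utterance: str) -> str:
    """Hypothetical hook: translate a German utterance into English offline."""
    raise NotImplementedError("plug in an offline MT system here")


def label_with_source_nlu(utterance: str) -> Tuple[str, List[str]]:
    """Hypothetical hook: run the existing English NLU model, return (intent, slot tags)."""
    raise NotImplementedError("plug in the source-language NLU model here")


def project_slots(src_tokens: List[str], src_slots: List[str],
                  tgt_tokens: List[str]) -> List[str]:
    """Hypothetical hook: map slot tags back onto German tokens, e.g. via word alignment."""
    raise NotImplementedError("plug in a label projection / alignment step here")


def annotate_offline(unlabeled_de: List[str]) -> List[LabeledUtterance]:
    """Turn unlabeled German utterances into synthetic training examples, offline."""
    synthetic = []
    for de_utt in unlabeled_de:
        en_utt = translate_to_source(de_utt)              # MT runs offline, not at query time
        intent, en_slots = label_with_source_nlu(en_utt)  # reuse the mature English model
        de_tokens = de_utt.split()
        de_slots = project_slots(en_utt.split(), en_slots, de_tokens)
        synthetic.append(LabeledUtterance(de_tokens, intent, de_slots))
    # The synthetic examples are then merged into the German NLU training data.
    return synthetic
```

Because the translation and labeling happen before training, the deployed German model answers live traffic directly, which is why this avoids the latency penalty of a translate-test setup.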
Anthology ID:
2021.naacl-industry.8
Volume:
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers
Month:
June
Year:
2021
Address:
Online
Editors:
Young-bum Kim, Yunyao Li, Owen Rambow
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
56–62
URL:
https://aclanthology.org/2021.naacl-industry.8
DOI:
10.18653/v1/2021.naacl-industry.8
Cite (ACL):
Abdalghani Abujabal, Claudio Delli Bovi, Sungho Ryu, Turan Gojayev, Fabian Triefenbach, and Yannick Versley. 2021. Continuous Model Improvement for Language Understanding with Machine Translation. In Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies: Industry Papers, pages 56–62, Online. Association for Computational Linguistics.
Cite (Informal):
Continuous Model Improvement for Language Understanding with Machine Translation (Abujabal et al., NAACL 2021)
PDF:
https://aclanthology.org/2021.naacl-industry.8.pdf
Video:
https://aclanthology.org/2021.naacl-industry.8.mp4