Code-Switched Text Synthesis in Unseen Language Pairs

I-Hung Hsu, Avik Ray, Shubham Garg, Nanyun Peng, Jing Huang

Abstract
Existing efforts on text synthesis for code-switching mostly require training on code-switched texts in the target language pairs, limiting the deployment of the models to cases lacking code-switched data. In this work, we study the problem of synthesizing code-switched texts for language pairs absent from the training data. We introduce GLOSS, a model built on top of a pre-trained multilingual machine translation model (PMMTM) with an additional code-switching module. This module, either an adapter or extra prefixes, learns code-switching patterns from code-switched data during training, while the primary component of GLOSS, i.e., the PMMTM, is frozen. The design of only adjusting the code-switching module prevents our model from overfitting to the constrained training data for code-switching. Hence, GLOSS exhibits the ability to generalize and synthesize code-switched texts across a broader spectrum of language pairs. Additionally, we develop a self-training algorithm on target language pairs to further enhance the reliability of GLOSS. Automatic evaluations on four language pairs show that GLOSS achieves at least 55% relative improvement in BLEU and METEOR scores compared to strong baselines. Human evaluations on two language pairs further validate the success of GLOSS.
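To give a concrete picture of the parameter-efficient design described in the abstract (a frozen translation backbone plus a small trainable code-switching module), the sketch below shows the generic adapter pattern in plain PyTorch. All class names, sizes, and the stand-in backbone are illustrative assumptions; this is not the authors' GLOSS implementation, which builds on a full pre-trained multilingual MT model.

```python
import torch
import torch.nn as nn

class CodeSwitchAdapter(nn.Module):
    """Bottleneck adapter: the only trainable component in this sketch.
    (Name and sizes are illustrative, not taken from the paper.)"""
    def __init__(self, hidden_size: int, bottleneck_size: int = 64):
        super().__init__()
        self.down_proj = nn.Linear(hidden_size, bottleneck_size)
        self.up_proj = nn.Linear(bottleneck_size, hidden_size)
        self.activation = nn.ReLU()

    def forward(self, hidden_states: torch.Tensor) -> torch.Tensor:
        # Residual connection: the adapter adds a small correction on top of
        # the frozen backbone's representation rather than replacing it.
        return hidden_states + self.up_proj(self.activation(self.down_proj(hidden_states)))


def freeze_backbone(pmmtm: nn.Module) -> None:
    """Freeze all backbone parameters so only adapter weights get gradients."""
    for param in pmmtm.parameters():
        param.requires_grad = False


# Illustrative usage with a stand-in backbone layer; a real setup would load a
# pre-trained multilingual translation model instead.
backbone = nn.TransformerEncoderLayer(d_model=512, nhead=8, batch_first=True)
adapter = CodeSwitchAdapter(hidden_size=512)
freeze_backbone(backbone)

x = torch.randn(2, 10, 512)            # (batch, sequence, hidden)
out = adapter(backbone(x))             # frozen backbone, trainable adapter
optimizer = torch.optim.Adam(adapter.parameters(), lr=1e-4)
```

Only the adapter's parameters are passed to the optimizer, mirroring the idea that the code-switching module is trained while the PMMTM stays fixed.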
Anthology ID:
2023.findings-acl.318
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5137–5151
URL:
https://aclanthology.org/2023.findings-acl.318
DOI:
10.18653/v1/2023.findings-acl.318
Cite (ACL):
I-Hung Hsu, Avik Ray, Shubham Garg, Nanyun Peng, and Jing Huang. 2023. Code-Switched Text Synthesis in Unseen Language Pairs. In Findings of the Association for Computational Linguistics: ACL 2023, pages 5137–5151, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Code-Switched Text Synthesis in Unseen Language Pairs (Hsu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.318.pdf
Video:
https://aclanthology.org/2023.findings-acl.318.mp4