The Effectiveness of Intermediate-Task Training for Code-Switched Natural Language Understanding

Archiki Prasad, Mohammad Ali Rehan, Shreya Pathak, Preethi Jyothi


Abstract
While recent benchmarks have spurred considerable work on improving the generalization of pretrained multilingual language models on multilingual tasks, techniques to improve code-switched natural language understanding have been far less explored. In this work, we propose bilingual intermediate pretraining as a reliable technique to derive large and consistent performance gains on code-switched text across three NLP tasks: Natural Language Inference (NLI), Question Answering (QA) and Sentiment Analysis (SA). We show consistent performance gains on four code-switched language pairs (Hindi-English, Spanish-English, Tamil-English and Malayalam-English) for SA, and on Hindi-English for NLI and QA. We also present a code-switched masked language modeling (MLM) pretraining technique that consistently benefits SA compared to standard MLM pretraining on real code-switched text.
Anthology ID:
2021.mrl-1.16
Volume:
Proceedings of the 1st Workshop on Multilingual Representation Learning
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Duygu Ataman, Alexandra Birch, Alexis Conneau, Orhan Firat, Sebastian Ruder, Gozde Gul Sahin
Venue:
MRL
Publisher:
Association for Computational Linguistics
Pages:
176–190
URL:
https://aclanthology.org/2021.mrl-1.16
DOI:
10.18653/v1/2021.mrl-1.16
Cite (ACL):
Archiki Prasad, Mohammad Ali Rehan, Shreya Pathak, and Preethi Jyothi. 2021. The Effectiveness of Intermediate-Task Training for Code-Switched Natural Language Understanding. In Proceedings of the 1st Workshop on Multilingual Representation Learning, pages 176–190, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
The Effectiveness of Intermediate-Task Training for Code-Switched Natural Language Understanding (Prasad et al., MRL 2021)
PDF:
https://aclanthology.org/2021.mrl-1.16.pdf
Video:
https://aclanthology.org/2021.mrl-1.16.mp4
Data:
SQuAD, TweetEval