BRCC and SentiBahasaRojak: The First Bahasa Rojak Corpus for Pretraining and Sentiment Analysis Dataset

Nanda Putri Romadhona, Sin-En Lu, Bo-Han Lu, Richard Tzong-Han Tsai

Abstract
Code-mixing refers to the mixed use of multiple languages within a single utterance. It is prevalent in multilingual societies and poses one of the most challenging problems in natural language processing. In this paper, we study Bahasa Rojak, a dialect popular in Malaysia that mixes English, Malay, and Chinese. To build a model that handles the code-mixing phenomena of Bahasa Rojak, we use data augmentation to automatically construct the first Bahasa Rojak corpus for pre-training language models, which we name the Bahasa Rojak Crawled Corpus (BRCC). We also develop a new pre-trained model called “Mixed XLM”, which automatically tags the language of each input token so that it can process code-mixed input. Finally, to evaluate Mixed XLM pre-trained on BRCC in social media scenarios, where code-mixing occurs frequently, we compile a new Bahasa Rojak sentiment analysis dataset, SentiBahasaRojak, with an inter-annotator agreement (Kappa) of 0.77.
Anthology ID:
2022.coling-1.389
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
4418–4428
URL:
https://aclanthology.org/2022.coling-1.389
Cite (ACL):
Nanda Putri Romadhona, Sin-En Lu, Bo-Han Lu, and Richard Tzong-Han Tsai. 2022. BRCC and SentiBahasaRojak: The First Bahasa Rojak Corpus for Pretraining and Sentiment Analysis Dataset. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4418–4428, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
BRCC and SentiBahasaRojak: The First Bahasa Rojak Corpus for Pretraining and Sentiment Analysis Dataset (Romadhona et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.389.pdf