LaoPLM: Pre-trained Language Models for Lao

Nankai Lin, Yingwen Fu, Chuwei Chen, Ziyu Yang, Shengyi Jiang


Abstract
Trained on large-scale corpora, pre-trained language models (PLMs) can capture different levels of concepts in context and hence generate universal language representations, which benefit multiple downstream natural language processing (NLP) tasks. Although PLMs have been widely used in most NLP applications, especially for high-resource languages such as English, they remain under-represented in Lao NLP research. Previous work on Lao has been hampered by the lack of annotated datasets and the sparsity of language resources. In this work, we construct a text classification dataset to alleviate the resource-scarce situation of the Lao language. In addition, we present the first transformer-based PLMs for Lao in four versions: BERT-Small, BERT-Base, ELECTRA-Small, and ELECTRA-Base. Furthermore, we evaluate them on two downstream tasks: part-of-speech (POS) tagging and text classification. Experiments demonstrate the effectiveness of our Lao models. We release our models and datasets to the community, hoping to facilitate the future development of Lao NLP applications.
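
The released checkpoints can, in principle, be fine-tuned for the two evaluation tasks with the Hugging Face Transformers library. Below is a minimal Python sketch for the POS-tagging setup; the model identifier and the tag-set size are hypothetical placeholders (assumptions, not taken from this page), so substitute the checkpoint and label inventory the authors actually released.

    from transformers import AutoTokenizer, AutoModelForTokenClassification

    # Hypothetical checkpoint identifier -- replace with the released Lao BERT-Base model.
    MODEL_NAME = "lao-plm/bert-base"
    # Assumed tag-set size; the paper's actual POS inventory may differ.
    NUM_POS_TAGS = 27

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = AutoModelForTokenClassification.from_pretrained(MODEL_NAME, num_labels=NUM_POS_TAGS)

    # Tokenize a Lao sentence ("hello") and predict a POS tag id for each token.
    inputs = tokenizer("ສະບາຍດີ", return_tensors="pt")
    logits = model(**inputs).logits
    predicted_tag_ids = logits.argmax(dim=-1)
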
Anthology ID:
2022.lrec-1.698
Volume:
Proceedings of the Thirteenth Language Resources and Evaluation Conference
Month:
June
Year:
2022
Address:
Marseille, France
Editors:
Nicoletta Calzolari, Frédéric Béchet, Philippe Blache, Khalid Choukri, Christopher Cieri, Thierry Declerck, Sara Goggi, Hitoshi Isahara, Bente Maegaard, Joseph Mariani, Hélène Mazo, Jan Odijk, Stelios Piperidis
Venue:
LREC
Publisher:
European Language Resources Association
Pages:
6506–6512
URL:
https://aclanthology.org/2022.lrec-1.698
Cite (ACL):
Nankai Lin, Yingwen Fu, Chuwei Chen, Ziyu Yang, and Shengyi Jiang. 2022. LaoPLM: Pre-trained Language Models for Lao. In Proceedings of the Thirteenth Language Resources and Evaluation Conference, pages 6506–6512, Marseille, France. European Language Resources Association.
Cite (Informal):
LaoPLM: Pre-trained Language Models for Lao (Lin et al., LREC 2022)
PDF:
https://aclanthology.org/2022.lrec-1.698.pdf
Data
CC100