Sailor: Open Language Models for South-East Asia
Longxu Dou, Qian Liu, Guangtao Zeng, Jia Guo, Jiahui Zhou, Xin Mao, Ziqi Jin, Wei Lu, Min Lin
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing: System Demonstrations
We present Sailor, a family of open language models ranging from 0.5B to 14B parameters, tailored for South-East Asian (SEA) languages. Built on Qwen1.5, Sailor models are continually pre-trained on 200B to 400B tokens, primarily covering English, Chinese, Vietnamese, Thai, Indonesian, Malay, and Lao. Training leverages several techniques: BPE dropout to improve model robustness, aggressive data cleaning and deduplication, and small proxy models to optimize the data mixture. Experimental results on four typical tasks show that Sailor models perform strongly across benchmarks covering commonsense reasoning, question answering, reading comprehension, and examination. We share our insights to spark wider interest in developing large language models for multilingual use cases.
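As a side note on one technique the abstract mentions: BPE dropout (Provilkov et al., 2020) randomly skips merge rules during tokenization, so the same word can be segmented in multiple ways and the model becomes more robust to subword boundaries. The following is a minimal toy sketch of the idea in pure Python; the merge table, dropout rate, and function names here are illustrative assumptions, not Sailor's actual tokenizer.

```python
import random

def bpe_encode(word, merges, dropout=0.0, rng=None):
    """Greedily apply ranked BPE merges to a word.

    With dropout > 0, each candidate merge is randomly skipped with that
    probability, yielding varied segmentations of the same word.
    """
    rng = rng or random.Random()
    tokens = list(word)
    while True:
        # Find the best-ranked applicable merge, possibly dropping some.
        best = None
        for i in range(len(tokens) - 1):
            pair = (tokens[i], tokens[i + 1])
            if pair in merges and rng.random() >= dropout:
                rank = merges[pair]
                if best is None or rank < best[0]:
                    best = (rank, i)
        if best is None:
            break
        _, i = best
        tokens[i:i + 2] = [tokens[i] + tokens[i + 1]]
    return tokens

# Illustrative merge table: pair -> rank (lower rank = applied first).
merges = {("l", "o"): 0, ("lo", "w"): 1, ("e", "r"): 2, ("low", "er"): 3}

print(bpe_encode("lower", merges, dropout=0.0))  # deterministic: ['lower']
print(bpe_encode("lower", merges, dropout=0.5, rng=random.Random(0)))
```

With dropout disabled the segmentation is deterministic; at dropout 1.0 every merge is skipped and the word falls back to characters, which is the source of the segmentation diversity seen during training.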