%0 Conference Proceedings
%T Arabic Compact Language Modelling for Resource Limited Devices
%A Alyafeai, Zaid
%A Ahmad, Irfan
%Y Habash, Nizar
%Y Bouamor, Houda
%Y Hajj, Hazem
%Y Magdy, Walid
%Y Zaghouani, Wajdi
%Y Bougares, Fethi
%Y Tomeh, Nadi
%Y Abu Farha, Ibrahim
%Y Touileb, Samia
%S Proceedings of the Sixth Arabic Natural Language Processing Workshop
%D 2021
%8 April
%I Association for Computational Linguistics
%C Kyiv, Ukraine (Virtual)
%F alyafeai-ahmad-2021-arabic
%X Natural language modelling has gained a lot of interest recently. The current state-of-the-art results are achieved by first training a very large language model and then fine-tuning it on multiple tasks. However, there is little work on smaller, more compact language models for resource-limited devices or applications, or on how to efficiently train such models for a low-resource language like Arabic. In this paper, we investigate how such models can be trained in a compact way for Arabic. We also show how distillation and quantization can be applied to create even smaller models. Our experiments show that our largest model, which is 2x smaller than the baseline, achieves better results on multiple tasks with 2x less data for pretraining.
%U https://aclanthology.org/2021.wanlp-1.6
%P 53-59