ArabicTransformer: Efficient Large Arabic Language Model with Funnel Transformer and ELECTRA Objective

Sultan Alrowili, Vijay Shanker


Abstract
Pre-training Transformer-based models such as BERT and ELECTRA on a collection of Arabic corpora, as demonstrated by both AraBERT and AraELECTRA, has shown impressive results on downstream tasks. However, pre-training Transformer-based language models is computationally expensive, especially for large-scale models. Recently, the Funnel Transformer addressed the sequential redundancy inside the Transformer architecture by compressing the sequence of hidden states, leading to a significant reduction in pre-training cost. This paper empirically studies the performance and efficiency of building an Arabic language model with the Funnel Transformer and the ELECTRA objective. We find that our model achieves state-of-the-art results on several Arabic downstream tasks despite using fewer computational resources than other BERT-based models.
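The abstract combines two ideas: Funnel-style sequence compression, which pools adjacent hidden states so later self-attention blocks operate on a shorter sequence, and the ELECTRA objective, where a discriminator learns to flag tokens that were replaced by a small generator. The PyTorch sketch below is illustrative only and is not the authors' implementation; all function names, tensor shapes, and the toy inputs are assumptions made for the example.

```python
import torch
import torch.nn.functional as F

def funnel_pool(hidden, stride=2):
    """Funnel-style compression: mean-pool adjacent hidden states so the
    sequence length is halved, shrinking the cost of deeper attention blocks."""
    # hidden: (batch, seq_len, d_model) -> (batch, seq_len // stride, d_model)
    return F.avg_pool1d(hidden.transpose(1, 2),
                        kernel_size=stride, stride=stride).transpose(1, 2)

def electra_discriminator_loss(token_logits, replaced_labels):
    """ELECTRA objective: per-token binary classification of whether the token
    was replaced by the generator (1) or kept from the original input (0)."""
    # token_logits: (batch, seq_len); replaced_labels: (batch, seq_len) in {0, 1}
    return F.binary_cross_entropy_with_logits(token_logits, replaced_labels.float())

# Illustrative usage with toy shapes (batch=2, seq_len=8, d_model=16).
hidden = torch.randn(2, 8, 16)
pooled = funnel_pool(hidden)          # (2, 4, 16): half the sequence length
logits = torch.randn(2, 8)            # per-token "replaced?" scores from a discriminator head
labels = torch.randint(0, 2, (2, 8))  # 1 where the generator replaced the token
loss = electra_discriminator_loss(logits, labels)
print(pooled.shape, loss.item())
```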
Anthology ID:
2021.findings-emnlp.108
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Pages:
1255–1261
URL:
https://aclanthology.org/2021.findings-emnlp.108
DOI:
10.18653/v1/2021.findings-emnlp.108
Cite (ACL):
Sultan Alrowili and Vijay Shanker. 2021. ArabicTransformer: Efficient Large Arabic Language Model with Funnel Transformer and ELECTRA Objective. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 1255–1261, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
ArabicTransformer: Efficient Large Arabic Language Model with Funnel Transformer and ELECTRA Objective (Alrowili & Shanker, Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.108.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.108.mp4
Code:
salrowili/arabictransformer
Data:
ARCD, SQuAD, TyDiQA