ARBML: Democritizing Arabic Natural Language Processing Tools

Zaid Alyafeai, Maged Al-Shaibani


Abstract
Automating natural language understanding is a long-standing quest that has been pursued for decades. With the help of advances in machine learning and, in particular, deep learning, we are able to produce state-of-the-art models that can imitate human interactions with languages. Unfortunately, these advances are constrained by the availability of language resources. Progress for Arabic in this field, although the language has great potential, is still limited. This is apparent in both research and development. In this paper, we showcase some NLP models we trained for Arabic. We also present our methodology and pipeline for building such models, from data collection and preprocessing to tokenization and model deployment. These tools help advance the field and provide a systematic approach for extending NLP tools to many languages.
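The pipeline mentioned in the abstract (data collection, preprocessing, tokenization, model deployment) can be illustrated with a minimal, hypothetical sketch of the preprocessing and tokenization stages. The code below does not use the ARBML API; the normalize and tokenize helpers are illustrative names, and it only shows common Arabic text normalizations (diacritic removal, letter-variant unification) followed by a placeholder whitespace tokenizer, using the Python standard library.

import re

# Arabic diacritics (tashkeel), small high marks, and the tatweel character.
DIACRITICS = re.compile(r"[\u0617-\u061A\u064B-\u0652\u0640]")

def normalize(text):
    # Strip diacritics/tatweel, then unify common letter variants.
    text = DIACRITICS.sub("", text)
    text = re.sub("[إأآ]", "ا", text)   # unify alef variants
    text = re.sub("ى", "ي", text)       # alef maqsura -> ya
    text = re.sub("ة", "ه", text)       # ta marbuta -> ha
    return text

def tokenize(text):
    # Placeholder whitespace tokenizer; a real pipeline would typically
    # train a subword tokenizer on the normalized corpus instead.
    return normalize(text).split()

if __name__ == "__main__":
    sample = "اللُّغَةُ الْعَرَبِيَّةُ جَمِيلَةٌ"
    print(tokenize(sample))  # ['اللغه', 'العربيه', 'جميله']

The normalized, tokenized text would then feed into model training and, eventually, deployment; the specific steps used by ARBML are described in the paper itself.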
Anthology ID:
2020.nlposs-1.2
Volume:
Proceedings of Second Workshop for NLP Open Source Software (NLP-OSS)
Month:
November
Year:
2020
Address:
Online
Editors:
Eunjeong L. Park, Masato Hagiwara, Dmitrijs Milajevs, Nelson F. Liu, Geeticka Chauhan, Liling Tan
Venue:
NLPOSS
Publisher:
Association for Computational Linguistics
Pages:
8–13
URL:
https://aclanthology.org/2020.nlposs-1.2
DOI:
10.18653/v1/2020.nlposs-1.2
Cite (ACL):
Zaid Alyafeai and Maged Al-Shaibani. 2020. ARBML: Democritizing Arabic Natural Language Processing Tools. In Proceedings of Second Workshop for NLP Open Source Software (NLP-OSS), pages 8–13, Online. Association for Computational Linguistics.
Cite (Informal):
ARBML: Democritizing Arabic Natural Language Processing Tools (Alyafeai & Al-Shaibani, NLPOSS 2020)
PDF:
https://aclanthology.org/2020.nlposs-1.2.pdf
Video:
https://slideslive.com/38939739