Generating Classical Arabic Poetry using Pre-trained Models

Nehal Elkaref, Mervat Abu-Elkheir, Maryam ElOraby, Mohamed Abdelgaber


Abstract
Poetry generation tends to be a complicated task given meter and rhyme constraints. Previous work has resorted to exhaustive methods in order to incorporate these poetic elements. In this paper, we leave it to pre-trained models, GPT-J and BERTShared, to recognize patterns of meter and rhyme and generate classical Arabic poetry, and we present our findings and results on how well both models pick up on these classical Arabic poetic elements.
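For orientation only, below is a minimal sketch of prompting a pre-trained causal language model such as GPT-J with an opening verse via the Hugging Face transformers library. It is not the authors' pipeline: the public "EleutherAI/gpt-j-6B" checkpoint, the example prompt, and the decoding settings are illustrative assumptions, not the fine-tuned models or configuration reported in the paper.

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

# Public base checkpoint; the paper's fine-tuned weights are not released here.
model_name = "EleutherAI/gpt-j-6B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Seed generation with an opening hemistich (illustrative prompt, not from the paper).
prompt = "قفا نبك من ذكرى حبيب ومنزل"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a continuation; these decoding settings are generic defaults,
# not the configuration evaluated in the paper.
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_p=0.95,
    temperature=0.9,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```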
Anthology ID:
2022.wanlp-1.6
Volume:
Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Houda Bouamor, Hend Al-Khalifa, Kareem Darwish, Owen Rambow, Fethi Bougares, Ahmed Abdelali, Nadi Tomeh, Salam Khalifa, Wajdi Zaghouani
Venue:
WANLP
Publisher:
Association for Computational Linguistics
Pages:
53–62
URL:
https://aclanthology.org/2022.wanlp-1.6
DOI:
10.18653/v1/2022.wanlp-1.6
Cite (ACL):
Nehal Elkaref, Mervat Abu-Elkheir, Maryam ElOraby, and Mohamed Abdelgaber. 2022. Generating Classical Arabic Poetry using Pre-trained Models. In Proceedings of the Seventh Arabic Natural Language Processing Workshop (WANLP), pages 53–62, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
Generating Classical Arabic Poetry using Pre-trained Models (Elkaref et al., WANLP 2022)
PDF:
https://aclanthology.org/2022.wanlp-1.6.pdf