Cross-lingual Fine-tuning for Abstractive Arabic Text Summarization

Mram Kahla, Zijian Győző Yang, Attila Novák


Abstract
While abstractive summarization in certain languages, like English, has already achieved fairly good results thanks to the availability of trend-setting resources, like the CNN/Daily Mail dataset, and considerable progress in generative neural models, abstractive summarization for Arabic, the fifth most-spoken language globally, is still in its infancy. While some resources for extractive summarization have been available for some time, in this paper we present the first corpus of human-written abstractive news summaries in Arabic, hoping to lay the foundation for this line of research for this important language. The dataset consists of more than 21 thousand items. We used this dataset to train a set of neural abstractive summarization systems for Arabic by fine-tuning pre-trained language models such as multilingual BERT, AraBERT, and multilingual BART-50. As the Arabic dataset is much smaller than, e.g., the CNN/Daily Mail dataset, we also applied cross-lingual knowledge transfer to significantly improve the performance of our baseline systems. The setups included two M-BERT-based summarization models originally trained for Hungarian/English and a similar system based on M-BART-50 originally trained for Russian, all of which were further fine-tuned for Arabic. The models were evaluated in terms of ROUGE, and a manual evaluation of their fluency and adequacy was also performed.
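
The abstract names the transfer recipe but not its implementation details, so the sketch below shows one plausible way to realize the M-BART-50 variant with the Hugging Face transformers/datasets stack: start from a multilingual BART-50 checkpoint (in the paper, one already fine-tuned for summarization in another language, namely Russian) and continue fine-tuning it on the Arabic article-summary pairs. The checkpoint name, column names, and hyperparameters are illustrative assumptions, not the authors' exact setup.

# A minimal sketch of the cross-lingual fine-tuning step, assuming the
# Hugging Face transformers/datasets libraries. Checkpoint, column names,
# and hyperparameters are placeholders, not the authors' configuration.
from datasets import Dataset
from transformers import (
    DataCollatorForSeq2Seq,
    MBart50TokenizerFast,
    MBartForConditionalGeneration,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

# Starting point: in the paper, an mBART-50 model previously fine-tuned for
# summarization in a high-resource language; here, a plain base checkpoint.
checkpoint = "facebook/mbart-large-50"
tokenizer = MBart50TokenizerFast.from_pretrained(
    checkpoint, src_lang="ar_AR", tgt_lang="ar_AR"
)
model = MBartForConditionalGeneration.from_pretrained(checkpoint)

# Toy stand-in for the ~21 thousand Arabic article-summary pairs.
data = Dataset.from_dict({
    "article": ["(Arabic news article text)"],
    "summary": ["(human-written abstractive summary)"],
})

def preprocess(batch):
    # Tokenize articles as encoder input and summaries as decoder targets.
    model_inputs = tokenizer(batch["article"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = data.map(preprocess, batched=True, remove_columns=data.column_names)

args = Seq2SeqTrainingArguments(
    output_dir="mbart50-arabic-summarization",
    learning_rate=3e-5,               # assumed, not reported in the abstract
    num_train_epochs=3,               # assumed
    per_device_train_batch_size=4,    # assumed
)
trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()

Generated summaries could then be scored with a standard ROUGE implementation, such as the "rouge" metric in the Hugging Face evaluate library; note that ROUGE was designed for English, so meaningful scores on Arabic typically require appropriate tokenization or normalization of the text beforehand.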
Anthology ID: 2021.ranlp-1.74
Volume: Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021)
Month: September
Year: 2021
Address: Held Online
Editors: Ruslan Mitkov, Galia Angelova
Venue: RANLP
Publisher: INCOMA Ltd.
Pages: 655–663
URL: https://aclanthology.org/2021.ranlp-1.74
Cite (ACL): Mram Kahla, Zijian Győző Yang, and Attila Novák. 2021. Cross-lingual Fine-tuning for Abstractive Arabic Text Summarization. In Proceedings of the International Conference on Recent Advances in Natural Language Processing (RANLP 2021), pages 655–663, Held Online. INCOMA Ltd.
Cite (Informal): Cross-lingual Fine-tuning for Abstractive Arabic Text Summarization (Kahla et al., RANLP 2021)
PDF: https://aclanthology.org/2021.ranlp-1.74.pdf