Son Thai Pham
2026
Supachoke at AbjadMed: Enhancing Arabic Medical Text Classification Using Fine-Tuned AraBERT
Thanh Phu Nguyen | Tuan Thai Huy Nguyen Cu | Son Thai Pham | Tri Duy Ho Nguyen
Proceedings of the 2nd Workshop on NLP for Languages Using Arabic Script
Medical text classification is an important task in healthcare NLP, yet Arabic medical texts remain underexplored due to linguistic complexity and limited annotated data. In this paper, we study the effectiveness of AraBERT, a pre-trained Arabic transformer model, for Arabic medical text classification. We fine-tune AraBERT on a labeled medical dataset and evaluate its performance using standard classification metrics. Experimental results show that our fine-tuned AraBERT model achieves a private leaderboard score of 0.4076 and ranks 13th among participating teams, outperforming classical machine learning baselines and other transformer variants. These findings highlight the potential of transformer-based approaches for Arabic medical NLP and motivate further research.
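The abstract does not specify which "standard classification metrics" back the leaderboard score; a common choice for imbalanced multi-class tasks like medical text classification is macro-averaged F1. The sketch below is a minimal, dependency-free illustration of that metric over hypothetical category labels — the label names and the assumption that macro-F1 is the relevant metric are both illustrative, not taken from the paper.

```python
def macro_f1(y_true, y_pred):
    """Macro-averaged F1: compute F1 per class, then average with equal class weight."""
    classes = set(y_true) | set(y_pred)
    f1_scores = []
    for c in classes:
        tp = sum(t == c and p == c for t, p in zip(y_true, y_pred))
        fp = sum(t != c and p == c for t, p in zip(y_true, y_pred))
        fn = sum(t == c and p != c for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        denom = precision + recall
        f1_scores.append(2 * precision * recall / denom if denom else 0.0)
    return sum(f1_scores) / len(f1_scores)

# Hypothetical medical-category labels, purely for illustration.
gold = ["cardio", "cardio", "neuro", "derm"]
pred = ["cardio", "neuro", "neuro", "derm"]
print(round(macro_f1(gold, pred), 4))  # → 0.7778
```

Because macro-F1 weights every class equally, rare diagnostic categories influence the score as much as frequent ones, which is why it is often preferred over plain accuracy in this setting.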