Comparing Prompt-Based and Standard Fine-Tuning for Urdu Text Classification

Faizad Ullah, Ubaid Azam, Ali Faheem, Faisal Kamiran, Asim Karim


Abstract
Recent advancements in natural language processing have demonstrated the efficacy of pre-trained language models for various downstream tasks through prompt-based fine-tuning. In contrast to standard fine-tuning, which relies solely on labeled examples, prompt-based fine-tuning combines a few labeled examples (few-shot) with guidance through prompts tailored to the specific language and task. For low-resource languages, where labeled examples are limited, prompt-based fine-tuning appears to be a promising alternative. In this paper, we compare prompt-based and standard fine-tuning for the popular task of text classification in Urdu and Roman Urdu. We conduct experiments on five datasets covering different domains, using pre-trained multilingual transformers. The results reveal that prompt-based fine-tuning achieves a significant improvement of up to 13% in accuracy over standard fine-tuning approaches. This suggests the potential of prompt-based fine-tuning as a valuable approach for low-resource languages with limited labeled data.
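
The contrast between the two approaches can be illustrated with a minimal sketch of prompt-based (cloze-style) classification using a masked multilingual language model. This is not the authors' code: the backbone (xlm-roberta-base), the English template, and the verbalizer words are illustrative assumptions, whereas the paper's prompts and label words are tailored to Urdu and Roman Urdu.

```python
# Minimal sketch of prompt-based classification with a masked multilingual LM,
# in the spirit of the paper's prompt-based fine-tuning setup.
# Assumptions (not from the paper): the model name, the English template, and
# the verbalizer words below are placeholders for illustration.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

MODEL_NAME = "xlm-roberta-base"  # assumed multilingual backbone
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForMaskedLM.from_pretrained(MODEL_NAME)

# Hypothetical verbalizer: each class maps to one label word the MLM predicts.
verbalizer = {"positive": "good", "negative": "bad"}
# Approximate each label word by its first sub-token id (a common simplification).
label_word_ids = {
    label: tokenizer(word, add_special_tokens=False)["input_ids"][0]
    for label, word in verbalizer.items()
}

def build_prompt(text: str) -> str:
    # Cloze template: the mask token stands where the label word should appear.
    return f"{text} Overall, it was {tokenizer.mask_token}."

def classify(text: str) -> str:
    enc = tokenizer(build_prompt(text), return_tensors="pt", truncation=True)
    with torch.no_grad():
        logits = model(**enc).logits  # shape: [1, seq_len, vocab_size]
    # Locate the mask position and score each class by its label-word logit.
    mask_pos = (enc["input_ids"][0] == tokenizer.mask_token_id).nonzero()[0].item()
    scores = {label: logits[0, mask_pos, wid].item()
              for label, wid in label_word_ids.items()}
    return max(scores, key=scores.get)

print(classify("The service at this restaurant was excellent."))

# Few-shot prompt-based fine-tuning then minimises cross-entropy over the
# label-word logits at the mask position for a handful of labelled examples,
# whereas standard fine-tuning trains a freshly initialised classification head.
```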
Anthology ID:
2023.findings-emnlp.449
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
6747–6754
URL:
https://aclanthology.org/2023.findings-emnlp.449
DOI:
10.18653/v1/2023.findings-emnlp.449
Cite (ACL):
Faizad Ullah, Ubaid Azam, Ali Faheem, Faisal Kamiran, and Asim Karim. 2023. Comparing Prompt-Based and Standard Fine-Tuning for Urdu Text Classification. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 6747–6754, Singapore. Association for Computational Linguistics.
Cite (Informal):
Comparing Prompt-Based and Standard Fine-Tuning for Urdu Text Classification (Ullah et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.449.pdf