WER We Stand: Benchmarking Urdu ASR Models

Samee Arif, Aamina Jamal Khan, Mustafa Abbas, Agha Ali Raza, Awais Athar


Abstract
This paper presents a comprehensive evaluation of Urdu Automatic Speech Recognition (ASR) models. We analyze the performance of three ASR model families (Whisper, MMS, and Seamless-M4T) using Word Error Rate (WER), along with a detailed examination of the most frequently misrecognized words and of error types: insertions, deletions, and substitutions. Our analysis covers two types of datasets: read speech and conversational speech. Notably, we present the first conversational speech dataset designed for benchmarking Urdu ASR models. We find that seamless-large outperforms the other ASR models on the read speech dataset, while whisper-large performs best on the conversational speech dataset. Furthermore, this evaluation highlights the difficulty of assessing ASR models for low-resource languages like Urdu using quantitative metrics alone and emphasizes the need for a robust Urdu text normalization system. Our findings contribute valuable insights for developing robust ASR systems for low-resource languages like Urdu.
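For readers unfamiliar with the metric, WER is the word-level edit (Levenshtein) distance between a reference transcript and the ASR hypothesis, normalized by the reference length; the three edit operations are exactly the insertion, deletion, and substitution errors analyzed in the paper. The following is a minimal illustrative sketch, not the authors' evaluation code (and it omits the text normalization the paper argues Urdu needs):

```python
def wer(reference: str, hypothesis: str) -> float:
    """Word Error Rate: word-level edit distance / number of reference words."""
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming table: d[i][j] = edit distance between the
    # first i reference words and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i  # i deletions to reach an empty hypothesis
    for j in range(len(hyp) + 1):
        d[0][j] = j  # j insertions starting from an empty reference
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            substitution = d[i - 1][j - 1] + (ref[i - 1] != hyp[j - 1])
            deletion = d[i - 1][j] + 1
            insertion = d[i][j - 1] + 1
            d[i][j] = min(substitution, deletion, insertion)
    return d[len(ref)][len(hyp)] / len(ref)
```

For example, against a four-word reference, one substitution plus one deletion yields a WER of 2/4 = 0.5. Published benchmarks typically use a tested library (e.g. the `jiwer` package) rather than a hand-rolled implementation like this one.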
Anthology ID:
2025.coling-main.397
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
5952–5961
URL:
https://aclanthology.org/2025.coling-main.397/
Cite (ACL):
Samee Arif, Aamina Jamal Khan, Mustafa Abbas, Agha Ali Raza, and Awais Athar. 2025. WER We Stand: Benchmarking Urdu ASR Models. In Proceedings of the 31st International Conference on Computational Linguistics, pages 5952–5961, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
WER We Stand: Benchmarking Urdu ASR Models (Arif et al., COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.397.pdf