Towards Language-Agnostic STIPA: Universal Phonetic Transcription to Support Language Documentation at Scale

Jacob Lee Suchardt, Hana El-Shazli, Pierluigi Cassotti


Abstract
This paper explores the use of existing state-of-the-art automatic speech recognition (ASR) models for the task of generating narrow phonetic transcriptions in the International Phonetic Alphabet (speech-to-IPA, STIPA). Unlike conventional ASR systems, which focus on orthographic output for high-resource languages, STIPA can serve as a language-agnostic interface valuable for documenting under-resourced and unwritten languages. We introduce a new dataset for South Levantine Arabic and present the first large-scale evaluation of STIPA models across 51 language families. Additionally, we provide a use case on Sanna, a severely endangered language. Our findings show that fine-tuned ASR models can produce accurate IPA transcriptions with limited supervision, significantly reducing phonetic error rates even in extremely low-resource settings. These results highlight the potential of STIPA for scalable language documentation.
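The abstract reports reductions in phonetic error rate. As an illustration only (this is not the authors' evaluation code), the sketch below computes a simple phone error rate as Levenshtein distance over IPA segments, grouping combining diacritics with their base symbols; real evaluations may additionally merge length marks, tie bars, and other multi-character phones.

import unicodedata

def ipa_segments(s: str) -> list[str]:
    """Split an IPA string into base characters plus attached combining diacritics.

    Simplification: modifier letters such as the length mark are kept as
    separate segments rather than joined to the preceding vowel.
    """
    segs = []
    for ch in unicodedata.normalize("NFD", s):
        if unicodedata.combining(ch) and segs:
            segs[-1] += ch  # attach diacritic to the preceding base symbol
        else:
            segs.append(ch)
    return segs

def levenshtein(ref: list[str], hyp: list[str]) -> int:
    """Standard edit distance (insertions, deletions, substitutions)."""
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,              # deletion
                           cur[j - 1] + 1,           # insertion
                           prev[j - 1] + (r != h)))  # substitution
        prev = cur
    return prev[-1]

def phone_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = ipa_segments(reference), ipa_segments(hypothesis)
    return levenshtein(ref, hyp) / max(len(ref), 1)

# One substitution out of six reference segments -> PER of about 0.17
print(phone_error_rate("kitaːb", "kitaːp"))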
Anthology ID:
2025.emnlp-main.1600
Volume:
Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2025
Address:
Suzhou, China
Editors:
Christos Christodoulopoulos, Tanmoy Chakraborty, Carolyn Rose, Violet Peng
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
31411–31427
URL:
https://aclanthology.org/2025.emnlp-main.1600/
Cite (ACL):
Jacob Lee Suchardt, Hana El-Shazli, and Pierluigi Cassotti. 2025. Towards Language-Agnostic STIPA: Universal Phonetic Transcription to Support Language Documentation at Scale. In Proceedings of the 2025 Conference on Empirical Methods in Natural Language Processing, pages 31411–31427, Suzhou, China. Association for Computational Linguistics.
Cite (Informal):
Towards Language-Agnostic STIPA: Universal Phonetic Transcription to Support Language Documentation at Scale (Suchardt et al., EMNLP 2025)
PDF:
https://aclanthology.org/2025.emnlp-main.1600.pdf
Checklist:
 2025.emnlp-main.1600.checklist.pdf