Unified Model for Code-Switching Speech Recognition and Language Identification Based on Concatenated Tokenizer

Kunal Dhawan, Dima Rekesh, Boris Ginsburg


Abstract
Code-Switching (CS) multilingual Automatic Speech Recognition (ASR) models can transcribe speech containing two or more alternating languages during a conversation. This paper proposes (1) a new method for creating code-switching ASR datasets from purely monolingual data sources, and (2) a novel Concatenated Tokenizer that enables ASR models to generate language ID for each emitted text token while reusing existing monolingual tokenizers. The efficacy of these approaches for building CS ASR models is demonstrated for two language pairs, English-Hindi and English-Spanish, where we achieve new state-of-the-art results on the Miami Bangor CS evaluation corpus. In addition to competitive ASR performance, the proposed Concatenated Tokenizer models are highly effective for spoken language identification, achieving 98%+ accuracy on the out-of-distribution FLEURS dataset.
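The core mechanism behind the Concatenated Tokenizer can be sketched in a few lines: two existing monolingual tokenizers are reused unchanged, the second tokenizer's token IDs are shifted by the first's vocabulary size, and the ID range a token falls into directly identifies its language. The sketch below is a minimal illustration of that idea, not the authors' implementation; ToyTokenizer, ConcatenatedTokenizer, and all names here are hypothetical stand-ins for real subword tokenizers (e.g., SentencePiece models).

    class ToyTokenizer:
        """Tiny word-level stand-in for a real monolingual subword tokenizer."""
        def __init__(self, vocab):
            self.id_to_tok = list(vocab)
            self.tok_to_id = {t: i for i, t in enumerate(self.id_to_tok)}

        @property
        def vocab_size(self):
            return len(self.id_to_tok)

        def encode(self, text):
            return [self.tok_to_id[w] for w in text.split() if w in self.tok_to_id]

        def decode(self, ids):
            return " ".join(self.id_to_tok[i] for i in ids)


    class ConcatenatedTokenizer:
        """Joins two monolingual tokenizers into one shared vocabulary.

        IDs [0, n1) come from tokenizer 1; IDs [n1, n1 + n2) from tokenizer 2,
        so the ID range of each emitted token directly encodes its language.
        """
        def __init__(self, tok1, lang1, tok2, lang2):
            self.tok1, self.tok2 = tok1, tok2
            self.lang1, self.lang2 = lang1, lang2
            self.offset = tok1.vocab_size  # shift applied to tokenizer-2 IDs

        @property
        def vocab_size(self):
            return self.tok1.vocab_size + self.tok2.vocab_size

        def lang_of(self, token_id):
            return self.lang1 if token_id < self.offset else self.lang2

        def decode_with_lang(self, ids):
            return [(self.lang_of(i),
                     self.tok1.decode([i]) if i < self.offset
                     else self.tok2.decode([i - self.offset]))
                    for i in ids]


    en = ToyTokenizer(["hello", "friend"])
    es = ToyTokenizer(["hola", "amigo"])
    cs = ConcatenatedTokenizer(en, "en", es, "es")

    ids = en.encode("hello") + [i + cs.offset for i in es.encode("amigo")]
    print(cs.decode_with_lang(ids))  # [('en', 'hello'), ('es', 'amigo')]

Because language identification reduces to a range check on token IDs, the ASR decoder's output stream carries a per-token language label at no extra cost, which is consistent with the unified ASR-plus-LID model the abstract describes.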
Anthology ID:
2023.calcs-1.7
Volume:
Proceedings of the 6th Workshop on Computational Approaches to Linguistic Code-Switching
Month:
December
Year:
2023
Address:
Singapore
Editors:
Genta Winata, Sudipta Kar, Marina Zhukova, Thamar Solorio, Mona Diab, Sunayana Sitaram, Monojit Choudhury, Kalika Bali
Venues:
CALCS | WS
Publisher:
Association for Computational Linguistics
Pages:
74–82
URL:
https://aclanthology.org/2023.calcs-1.7
DOI:
10.18653/v1/2023.calcs-1.7
Cite (ACL):
Kunal Dhawan, Dima Rekesh, and Boris Ginsburg. 2023. Unified Model for Code-Switching Speech Recognition and Language Identification Based on Concatenated Tokenizer. In Proceedings of the 6th Workshop on Computational Approaches to Linguistic Code-Switching, pages 74–82, Singapore. Association for Computational Linguistics.
Cite (Informal):
Unified Model for Code-Switching Speech Recognition and Language Identification Based on Concatenated Tokenizer (Dhawan et al., CALCS-WS 2023)
PDF:
https://aclanthology.org/2023.calcs-1.7.pdf