Code-Switching Metrics Using Intonation Units

Rebecca Pattichis, Dora LaCasse, Sonya Trawick, Rena Cacoullos
Abstract
Code-switching (CS) metrics in NLP that are based on word-level units are misaligned with true bilingual CS behavior. Crucially, CS is not equally likely between any two words, but follows syntactic and prosodic rules. We adapt two metrics, multilinguality and CS probability, and apply them to transcribed bilingual speech, for the first time putting forward Intonation Units (IUs) – prosodic speech segments – as basic tokens for NLP tasks. In addition, we calculate these two metrics separately for distinct mixing types: alternating-language multi-word strings and single-word incorporations from one language into another. Results indicate that individual differences according to the two CS metrics are independent. However, there is a shared tendency among bilinguals for multi-word CS to occur across, rather than within, IU boundaries. That is, bilinguals tend to prosodically separate their two languages. This constraint is blurred when metric calculations do not distinguish multi-word and single-word items. These results call for a reconsideration of units of analysis in future development of CS datasets for NLP tasks.
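The "multilinguality" and "CS probability" metrics in the abstract correspond to the M-index and I-index family of code-switching measures (Guzmán et al., 2017), which the paper adapts to Intonation Units. The sketch below is a minimal illustration of the standard word-level formulations, not the authors' implementation: it computes both measures over a sequence of language tags. To mirror the paper's adaptation, the tags would label IUs rather than raw words, and multi-word strings would be scored separately from single-word incorporations; the function names and example tags here are hypothetical.

# Minimal sketch (not the authors' code): standard M-index and I-index
# over a sequence of language tags. In the paper's adaptation the tagged
# tokens would be Intonation Units rather than words.

from collections import Counter

def m_index(tags: list[str]) -> float:
    """Multilinguality: 0 for monolingual text, 1 for a perfectly
    even mix of the k languages (Gini-based M-index)."""
    n = len(tags)
    k = len(set(tags))
    if k < 2:
        return 0.0
    sum_p2 = sum((count / n) ** 2 for count in Counter(tags).values())
    return (1 - sum_p2) / ((k - 1) * sum_p2)

def i_index(tags: list[str]) -> float:
    """CS probability: fraction of adjacent token pairs whose
    language tags differ (I-index)."""
    if len(tags) < 2:
        return 0.0
    switches = sum(a != b for a, b in zip(tags, tags[1:]))
    return switches / (len(tags) - 1)

# Hypothetical tag sequence for a short Spanish-English stretch.
tags = ["spa", "spa", "eng", "eng", "eng", "spa"]
print(f"M-index: {m_index(tags):.3f}, I-index: {i_index(tags):.3f}")

For this example the two languages are evenly balanced (M-index 1.0) while only 2 of 5 adjacent pairs switch language (I-index 0.4), illustrating that mixing proportion and switching rate can vary independently, consistent with the abstract's finding that individual differences on the two metrics are independent.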
Anthology ID:
2023.emnlp-main.1047
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
16840–16849
URL:
https://aclanthology.org/2023.emnlp-main.1047
DOI:
10.18653/v1/2023.emnlp-main.1047
Cite (ACL):
Rebecca Pattichis, Dora LaCasse, Sonya Trawick, and Rena Cacoullos. 2023. Code-Switching Metrics Using Intonation Units. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 16840–16849, Singapore. Association for Computational Linguistics.
Cite (Informal):
Code-Switching Metrics Using Intonation Units (Pattichis et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.1047.pdf
Video:
https://aclanthology.org/2023.emnlp-main.1047.mp4