Language Models at the Syntax-Semantics Interface: A Case Study of the Long-Distance Binding of Chinese Reflexive Ziji

Xiulin Yang


Abstract
This paper explores whether language models can effectively resolve the complex binding patterns of the Mandarin Chinese reflexive ziji, which are constrained by both syntactic and semantic factors. We construct a dataset of 320 synthetic sentences using templates and examples from the syntactic literature, along with 360 natural sentences from the BCC corpus. Evaluating 21 language models against this dataset and comparing their performance to judgments from native Mandarin speakers, we find that none of the models consistently replicates human-like judgments. The results indicate that existing language models tend to rely heavily on sequential cues, though not always favoring the closest strings, and often overlook subtle semantic and syntactic constraints. They tend to be more sensitive to noun-related than to verb-related semantics.
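A common way to probe this kind of binding preference is to score each disambiguated reading of a sentence with a language model and take the highest-scoring reading as the model's choice, then measure agreement with human judgments. The sketch below is purely illustrative: the sentences, antecedent names, log-probability values, and human labels are hypothetical placeholders, not the paper's actual data, models, or evaluation protocol.

```python
# Hypothetical sketch: pick the antecedent of "ziji" whose disambiguated
# sentence a model scores highest, then measure agreement with human
# judgments. All scores and labels below are made-up placeholders.

def preferred_antecedent(scores):
    """Return the antecedent whose disambiguated sentence scores highest."""
    return max(scores, key=scores.get)

# Each item maps candidate antecedents to an (invented) model log-prob
# for the sentence with ziji resolved toward that antecedent, plus the
# antecedent chosen by (hypothetical) human annotators.
items = [
    {"scores": {"Zhangsan": -41.2, "Lisi": -39.8}, "human": "Lisi"},
    {"scores": {"Zhangsan": -37.5, "Lisi": -40.1}, "human": "Zhangsan"},
    {"scores": {"Zhangsan": -44.0, "Lisi": -42.3}, "human": "Zhangsan"},
]

# Agreement rate between the model's preferred reading and the human choice.
agree = sum(preferred_antecedent(it["scores"]) == it["human"] for it in items)
accuracy = agree / len(items)
print(f"model-human agreement: {accuracy:.2f}")
```

In practice the log-probabilities would come from an actual language model (e.g. by summing token log-likelihoods over each candidate sentence); here they are fixed numbers so the comparison logic is runnable on its own.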
Anthology ID:
2025.coling-main.257
Volume:
Proceedings of the 31st International Conference on Computational Linguistics
Month:
January
Year:
2025
Address:
Abu Dhabi, UAE
Editors:
Owen Rambow, Leo Wanner, Marianna Apidianaki, Hend Al-Khalifa, Barbara Di Eugenio, Steven Schockaert
Venue:
COLING
Publisher:
Association for Computational Linguistics
Pages:
3808–3824
URL:
https://aclanthology.org/2025.coling-main.257/
Cite (ACL):
Xiulin Yang. 2025. Language Models at the Syntax-Semantics Interface: A Case Study of the Long-Distance Binding of Chinese Reflexive Ziji. In Proceedings of the 31st International Conference on Computational Linguistics, pages 3808–3824, Abu Dhabi, UAE. Association for Computational Linguistics.
Cite (Informal):
Language Models at the Syntax-Semantics Interface: A Case Study of the Long-Distance Binding of Chinese Reflexive Ziji (Yang, COLING 2025)
PDF:
https://aclanthology.org/2025.coling-main.257.pdf