Quantifying Discourse Support for Omitted Pronouns

Shulin Zhang, Jixing Li, John Hale


Abstract
Pro-drop is common in many languages, but its discourse motivations have not been well characterized. Inspired by topic chain theory in Chinese, this study shows how character-verb usage continuity distinguishes dropped pronouns from overt references to story characters. We model the choice to drop vs. not drop a pronoun as a function of character-verb continuity. The results show that omitted subjects exhibit higher continuity between a character's verb-usage history and the current verb than non-omitted subjects. This is consistent with the idea that discourse coherence with a particular topic, such as a story character, facilitates the omission of pronouns in languages and contexts where they are optional.
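
To make the modeling setup concrete, the sketch below scores each subject mention with a single character-verb continuity feature and fits a logistic regression of drop vs. non-drop on that feature. This is a minimal illustration, not the paper's implementation: the cosine-similarity continuity measure, the synthetic data, and all names (`continuity`, `make_mention`, etc.) are assumptions introduced here.

```python
# Minimal sketch (illustrative assumptions, not the paper's method):
# continuity = mean cosine similarity between a character's past verb
# vectors and the current verb; a logistic regression then predicts
# whether the subject pronoun is dropped.
import numpy as np
from sklearn.linear_model import LogisticRegression


def continuity(history_verb_vecs, current_verb_vec):
    """Mean cosine similarity between past verb vectors and the current verb."""
    h = np.asarray(history_verb_vecs)
    sims = h @ current_verb_vec / (
        np.linalg.norm(h, axis=1) * np.linalg.norm(current_verb_vec) + 1e-9
    )
    return float(sims.mean())


def make_mention(dropped, rng, dim=50):
    """Synthetic mention; dropped subjects get a current verb closer to the history (toy assumption)."""
    history = [rng.normal(size=dim) for _ in range(5)]
    centre = np.mean(history, axis=0)
    current = centre + (0.5 if dropped else 2.0) * rng.normal(size=dim)
    return history, current, dropped


rng = np.random.default_rng(0)
mentions = [make_mention(d, rng) for d in [0, 1] * 50]

X = np.array([[continuity(h, v)] for h, v, _ in mentions])  # one feature per mention
y = np.array([d for _, _, d in mentions])                   # 1 = pronoun dropped

clf = LogisticRegression().fit(X, y)
print("coefficient on continuity:", clf.coef_[0, 0])  # positive => higher continuity favours dropping
```

A positive coefficient on the continuity feature in this toy setup mirrors the direction of the reported finding: mentions whose current verb coheres with the character's usage history are more likely to be realized as dropped pronouns.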
Anthology ID: 2022.crac-1.1
Volume: Proceedings of the Fifth Workshop on Computational Models of Reference, Anaphora and Coreference
Month: October
Year: 2022
Address: Gyeongju, Republic of Korea
Editors: Maciej Ogrodniczuk, Sameer Pradhan, Anna Nedoluzhko, Vincent Ng, Massimo Poesio
Venue: CRAC
Publisher: Association for Computational Linguistics
Pages: 1–12
URL: https://aclanthology.org/2022.crac-1.1
Cite (ACL): Shulin Zhang, Jixing Li, and John Hale. 2022. Quantifying Discourse Support for Omitted Pronouns. In Proceedings of the Fifth Workshop on Computational Models of Reference, Anaphora and Coreference, pages 1–12, Gyeongju, Republic of Korea. Association for Computational Linguistics.
Cite (Informal): Quantifying Discourse Support for Omitted Pronouns (Zhang et al., CRAC 2022)
PDF: https://aclanthology.org/2022.crac-1.1.pdf