Language Model Based Unsupervised Dependency Parsing with Conditional Mutual Information and Grammatical Constraints

Junjie Chen, Xiangheng He, Yusuke Miyao


Abstract
Previous methods based on Large Language Models (LLMs) perform unsupervised dependency parsing by maximizing bi-lexical dependence scores. However, the dependence scores these methods adopt are difficult to interpret, and the methods cannot incorporate grammatical constraints, which previous grammar-based parsing research has shown to be beneficial for parsing performance. In this work, we apply Conditional Mutual Information (CMI), an interpretable metric, to measure bi-lexical dependence, and we incorporate grammatical constraints into LLM-based unsupervised parsing. We incorporate Part-Of-Speech information as a grammatical constraint at the CMI estimation stage and integrate two additional grammatical constraints at the subsequent tree decoding stage. We find that the CMI score correlates positively with syntactic dependencies, and more strongly so than baseline scores. Our experiments confirm the benefits and applicability of the proposed grammatical constraints across five languages and eight datasets. The CMI parsing model outperforms state-of-the-art LLM-based models and similarly constrained grammar-based models. Our analysis reveals that the CMI model is strong at retrieving dependency relations with rich lexical interactions but weak at retrieving relations with sparse lexical interactions, indicating a potential limitation of CMI-based unsupervised parsing methods.
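As a rough illustration of the kind of bi-lexical dependence score the abstract describes, the sketch below estimates a pointwise CMI-style score between two words with an off-the-shelf masked language model: how much more probable word i becomes when word j is visible versus masked, given the rest of the sentence. The model choice (bert-base-cased), the single-subword-per-word simplification, and the mask-and-compare approximation are assumptions made for this example, not the paper's released implementation, which also conditions the estimate on Part-Of-Speech information.

```python
# Minimal sketch (assumptions noted above): a pointwise CMI-style bi-lexical
# dependence score estimated with a masked language model.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForMaskedLM.from_pretrained("bert-base-cased")
model.eval()

def log_prob_at(ids, target_pos, target_id, masked_positions):
    """Log-probability of the original token at target_pos, with the given
    positions replaced by [MASK]."""
    ids = ids.copy()
    for p in masked_positions:
        ids[p] = tokenizer.mask_token_id
    with torch.no_grad():
        logits = model(torch.tensor([ids])).logits[0, target_pos]
    return torch.log_softmax(logits, dim=-1)[target_id].item()

def cmi_score(words, i, j):
    """Pointwise CMI-style score between words i and j given the rest of the
    sentence: log p(w_i | context, w_j) - log p(w_i | context, w_j masked).
    Assumes each word maps to a single subword token, for simplicity."""
    ids = tokenizer(" ".join(words))["input_ids"]
    pi, pj = i + 1, j + 1  # +1 offsets skip the [CLS] token
    target_id = ids[pi]
    with_j = log_prob_at(ids, pi, target_id, masked_positions=[pi])
    without_j = log_prob_at(ids, pi, target_id, masked_positions=[pi, pj])
    return with_j - without_j

# Example: dependence of "dog" on its head "chased" versus an unrelated pair.
print(cmi_score(["The", "dog", "chased", "the", "cat"], i=1, j=2))
print(cmi_score(["The", "dog", "chased", "the", "cat"], i=1, j=3))
```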
Anthology ID:
2024.naacl-long.352
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
6355–6366
URL:
https://aclanthology.org/2024.naacl-long.352
Cite (ACL):
Junjie Chen, Xiangheng He, and Yusuke Miyao. 2024. Language Model Based Unsupervised Dependency Parsing with Conditional Mutual Information and Grammatical Constraints. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 6355–6366, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Language Model Based Unsupervised Dependency Parsing with Conditional Mutual Information and Grammatical Constraints (Chen et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.352.pdf
Copyright:
2024.naacl-long.352.copyright.pdf