Learning Clause Representation from Dependency-Anchor Graph for Connective Prediction

Yanjun Gao, Ting-Hao Huang, Rebecca J. Passonneau


Abstract
Semantic representation that supports the choice of an appropriate connective between pairs of clauses inherently addresses discourse coherence, which is important for tasks such as narrative understanding, argumentation, and discourse parsing. We propose a novel clause embedding method that applies graph learning to a data structure we refer to as a dependency-anchor graph. The dependency-anchor graph incorporates two kinds of syntactic information, constituency structure and dependency relations, to highlight the subject and verb phrase relation, which enhances coherence-related aspects of the representation. We design a neural model that learns a semantic representation for clauses from graph convolution over latent representations of the subject and verb phrase. We evaluate our method on two new datasets: a subset of a large corpus whose source texts are published novels, and a new dataset collected from students' essays. The results demonstrate a significant improvement over tree-based models, confirming the importance of emphasizing the subject and verb phrase. The performance gap between the two datasets illustrates the challenges of analyzing students' written text, and points to a potential evaluation task for coherence modeling as well as an application that suggests revisions to students.
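
The overall idea described in the abstract (graph convolution over clause nodes with a readout of the subject and verb-phrase anchors, then connective prediction for a clause pair) can be sketched roughly as below. This is a hypothetical PyTorch illustration of the general technique, not the authors' released implementation (see serenayj/DeSSE for the actual code); the module names, the single-layer convolution, and the concatenation-based anchor readout are assumptions made for illustration only.

    import torch
    import torch.nn as nn

    class DependencyAnchorEncoder(nn.Module):
        # Hypothetical sketch: one graph-convolution step over word-node states
        # connected by a dependency-anchor adjacency matrix, followed by a
        # readout of the subject and verb-phrase anchor nodes.
        def __init__(self, hidden_dim):
            super().__init__()
            self.proj = nn.Linear(hidden_dim, hidden_dim)

        def forward(self, node_states, adj, subj_idx, vp_idx):
            # node_states: (num_nodes, hidden_dim); adj: (num_nodes, num_nodes),
            # assumed to already include self-loops.
            deg = adj.sum(dim=-1, keepdim=True).clamp(min=1.0)
            h = torch.relu(self.proj((adj / deg) @ node_states))
            # Clause vector emphasizes the two anchors (subject and verb phrase).
            return torch.cat([h[subj_idx], h[vp_idx]], dim=-1)

    class ConnectivePredictor(nn.Module):
        # Scores a fixed inventory of connectives for a pair of clauses.
        def __init__(self, hidden_dim, num_connectives):
            super().__init__()
            self.encoder = DependencyAnchorEncoder(hidden_dim)
            self.scorer = nn.Linear(4 * hidden_dim, num_connectives)

        def forward(self, clause1, clause2):
            # Each clause is a tuple: (node_states, adj, subj_idx, vp_idx).
            v1 = self.encoder(*clause1)
            v2 = self.encoder(*clause2)
            return self.scorer(torch.cat([v1, v2], dim=-1))

The concatenated anchor readout is one simple way to realize the paper's emphasis on the subject and verb phrase; the actual model may pool node states differently.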
Anthology ID:
2021.textgraphs-1.6
Volume:
Proceedings of the Fifteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-15)
Month:
June
Year:
2021
Address:
Mexico City, Mexico
Editors:
Alexander Panchenko, Fragkiskos D. Malliaros, Varvara Logacheva, Abhik Jana, Dmitry Ustalov, Peter Jansen
Venue:
TextGraphs
Publisher:
Association for Computational Linguistics
Pages:
54–66
URL:
https://aclanthology.org/2021.textgraphs-1.6
DOI:
10.18653/v1/2021.textgraphs-1.6
Cite (ACL):
Yanjun Gao, Ting-Hao Huang, and Rebecca J. Passonneau. 2021. Learning Clause Representation from Dependency-Anchor Graph for Connective Prediction. In Proceedings of the Fifteenth Workshop on Graph-Based Methods for Natural Language Processing (TextGraphs-15), pages 54–66, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Learning Clause Representation from Dependency-Anchor Graph for Connective Prediction (Gao et al., TextGraphs 2021)
PDF:
https://aclanthology.org/2021.textgraphs-1.6.pdf
Code:
serenayj/DeSSE
Data:
SentEval