Learning Dynamic Representations for Discourse Dependency Parsing

Tianyi Liu, Yansong Feng, Dongyan Zhao


Abstract
Transition systems have been widely used for the discourse dependency parsing task. Existing works often characterize transition states by examining a certain number of elementary discourse units (EDUs), while neglecting the arcs obtained from the transition history. In this paper, we propose to employ a GAT-based encoder to learn dynamic representations for sub-trees constructed in previous transition steps. By incorporating these representations, our model retains access to all parsed EDUs through the obtained arcs, thus better utilizing the structural information of the document, particularly when handling lengthy text spans with complex structures. For the discourse relation recognition task, we employ edge-featured GATs to derive better representations for EDU pairs. Experimental results show that our model can achieve state-of-the-art performance on widely adopted datasets including RST-DT, SciDTB and CDTB. Our code is available at https://github.com/lty-lty/Discourse-Dependency-Parsing.
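The paper's own implementation is linked above; purely as an illustration of the graph attention mechanism the abstract refers to, a minimal single-head GAT aggregation step (in the style of Velickovic et al., 2018) might look like the sketch below. All function names, shapes, and the toy inputs are illustrative assumptions, not the authors' code.

```python
import math


def leaky_relu(x, slope=0.2):
    # Standard LeakyReLU used for GAT attention logits (illustrative choice).
    return x if x > 0 else slope * x


def gat_layer(h, adj, W, a):
    """Single-head graph attention aggregation (illustrative sketch).

    h:   list of node feature vectors (e.g., EDU / sub-tree representations)
    adj: adjacency list per node (self-loops included)
    W:   projection matrix as a list of rows (out_dim x in_dim)
    a:   attention vector of length 2 * out_dim
    """
    # Linear projection z_i = W h_i for every node.
    z = [[sum(W[r][c] * hi[c] for c in range(len(hi))) for r in range(len(W))]
         for hi in h]
    out = []
    for i, nbrs in enumerate(adj):
        # Attention logits e_ij = LeakyReLU(a . [z_i || z_j]).
        e = [leaky_relu(sum(a[k] * (z[i] + z[j])[k] for k in range(len(a))))
             for j in nbrs]
        # Softmax over the neighborhood (numerically stabilized).
        m = max(e)
        w = [math.exp(x - m) for x in e]
        s = sum(w)
        alpha = [x / s for x in w]
        # New representation: attention-weighted sum of neighbor projections.
        out.append([sum(alpha[k] * z[j][r] for k, j in enumerate(nbrs))
                    for r in range(len(z[i]))])
    return out
```

With a zero attention vector the weights reduce to a uniform average over each node's neighborhood, which makes the aggregation easy to check by hand.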
Anthology ID:
2023.findings-emnlp.951
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
14253–14263
URL:
https://aclanthology.org/2023.findings-emnlp.951
DOI:
10.18653/v1/2023.findings-emnlp.951
Cite (ACL):
Tianyi Liu, Yansong Feng, and Dongyan Zhao. 2023. Learning Dynamic Representations for Discourse Dependency Parsing. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 14253–14263, Singapore. Association for Computational Linguistics.
Cite (Informal):
Learning Dynamic Representations for Discourse Dependency Parsing (Liu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.951.pdf