Improve Discourse Dependency Parsing with Contextualized Representations

Yifei Zhou, Yansong Feng


Abstract
Previous work shows that discourse analysis benefits from modeling the intra- and inter-sentential levels separately, which requires representations of text units at different granularities that capture both the units themselves and their relation to the context. In this paper, we propose to use transformers to encode contextualized representations of units at different levels, dynamically capturing the information needed for discourse dependency analysis at both the intra- and inter-sentential levels. Motivated by the observation that articles share writing patterns that can aid discourse analysis, we design sequence labeling methods that exploit such structural information from the context and substantially outperform traditional direct classification methods. Experiments show that our model achieves state-of-the-art results on both English and Chinese datasets.
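To make the abstract's two ideas concrete, below is a minimal sketch (not the authors' code) of the general recipe it describes: encode discourse units with a transformer so each unit's representation is contextualized by its neighbors, then predict dependency heads jointly over the sequence rather than classifying each (child, head) pair in isolation. The model name, pooling strategy, and head-selection formulation are illustrative assumptions, not the paper's actual architecture.

```python
# A hedged sketch of contextualized unit encoding + sequence-level head
# prediction for discourse dependency parsing. All concrete choices here
# (bert-base-uncased, mean pooling, biaffine-free scoring) are assumptions.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def encode_units(units):
    """Encode each elementary discourse unit (EDU) in the context of its
    neighbors by concatenating them with [SEP], then mean-pooling each
    unit's token vectors into one contextualized representation."""
    text = f" {tokenizer.sep_token} ".join(units)
    enc = tokenizer(text, return_tensors="pt", truncation=True)
    hidden = encoder(**enc).last_hidden_state.squeeze(0)   # (T, 768)
    # Recover unit spans from the [SEP] positions (skip the leading [CLS]).
    ids = enc["input_ids"].squeeze(0)
    boundaries = (ids == tokenizer.sep_token_id).nonzero().squeeze(-1).tolist()
    spans, start = [], 1
    for b in boundaries:
        spans.append((start, b))
        start = b + 1
    return torch.stack([hidden[s:e].mean(0) for s, e in spans if e > s])

class HeadSelector(nn.Module):
    """Sequence-labeling-style head prediction: for every unit, score all
    candidate heads (plus a dummy ROOT) jointly, instead of running an
    independent binary classifier per (child, head) pair."""
    def __init__(self, dim=768):
        super().__init__()
        self.root = nn.Parameter(torch.zeros(dim))
        self.child = nn.Linear(dim, dim)
        self.head = nn.Linear(dim, dim)

    def forward(self, units):                               # units: (n, dim)
        cands = torch.cat([self.root.unsqueeze(0), units])  # (n+1, dim)
        scores = self.child(units) @ self.head(cands).T     # (n, n+1)
        return scores.log_softmax(-1)  # per-unit distribution over heads

edus = ["The company reported strong earnings.",
        "However, its stock fell,",
        "because investors expected more."]
reps = encode_units(edus)
print(HeadSelector()(reps).shape)  # torch.Size([3, 4])
```

The point of scoring all heads per unit in one softmax is that competing attachments constrain each other across the sequence, which is one way structural context can be injected; the paper's actual labeling scheme may differ.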
Anthology ID:
2022.findings-naacl.173
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2250–2261
URL:
https://aclanthology.org/2022.findings-naacl.173
DOI:
10.18653/v1/2022.findings-naacl.173
Cite (ACL):
Yifei Zhou and Yansong Feng. 2022. Improve Discourse Dependency Parsing with Contextualized Representations. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 2250–2261, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Improve Discourse Dependency Parsing with Contextualized Representations (Zhou & Feng, Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.173.pdf
Video:
https://aclanthology.org/2022.findings-naacl.173.mp4