Weakly-Supervised Modeling of Contextualized Event Embedding for Discourse Relations

I-Ta Lee, Maria Leonor Pacheco, Dan Goldwasser


Abstract
Representing, and reasoning over, long narratives requires models that can deal with complex event structures connected through multiple relationship types. This paper proposes representing this type of information as a narrative graph and learning contextualized event representations over it using a relational graph neural network model. We train our model on a large corpus to capture event relations derived from the Penn Discourse TreeBank, and show that our multi-relational contextualized event representation improves performance when learning script knowledge without direct supervision and provides a better representation for the implicit discourse sense classification task.
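The relational graph neural network mentioned in the abstract can be sketched as a relation-specific message-passing layer in the style of R-GCN (Schlichtkrull et al., 2018): each relation type gets its own transformation, and a node's updated embedding aggregates transformed neighbor embeddings across relations. This is a minimal illustrative sketch, not the authors' implementation; the function name, NumPy usage, and normalization choice are assumptions.

```python
import numpy as np

def rgcn_layer(h, edges, rel_weights, w_self):
    """One relational graph convolution layer (illustrative sketch).

    h           : (num_nodes, d) event embeddings
    edges       : dict mapping relation name -> list of (src, dst) pairs
    rel_weights : dict mapping relation name -> (d, d) weight matrix
    w_self      : (d, d) self-loop weight matrix
    """
    out = h @ w_self  # self-loop contribution
    for rel, pairs in edges.items():
        W = rel_weights[rel]
        # normalize each node's incoming messages by its in-degree for this relation
        deg = np.zeros(h.shape[0])
        for _, dst in pairs:
            deg[dst] += 1
        for src, dst in pairs:
            out[dst] += (h[src] @ W) / deg[dst]
    return np.maximum(out, 0.0)  # ReLU nonlinearity

# Toy narrative graph: three events linked by a hypothetical "next" relation
h = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
edges = {"next": [(0, 1), (1, 2)]}
rel_weights = {"next": np.eye(2)}
contextualized = rgcn_layer(h, edges, rel_weights, np.eye(2))
```

In the paper's setting the relation types would come from PDTB-derived discourse relations rather than the single "next" relation used here, and stacking such layers lets each event embedding absorb context from its discourse neighborhood.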
Anthology ID:
2020.findings-emnlp.446
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2020
Month:
November
Year:
2020
Address:
Online
Editors:
Trevor Cohn, Yulan He, Yang Liu
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
4962–4972
URL:
https://aclanthology.org/2020.findings-emnlp.446
DOI:
10.18653/v1/2020.findings-emnlp.446
Cite (ACL):
I-Ta Lee, Maria Leonor Pacheco, and Dan Goldwasser. 2020. Weakly-Supervised Modeling of Contextualized Event Embedding for Discourse Relations. In Findings of the Association for Computational Linguistics: EMNLP 2020, pages 4962–4972, Online. Association for Computational Linguistics.
Cite (Informal):
Weakly-Supervised Modeling of Contextualized Event Embedding for Discourse Relations (Lee et al., Findings 2020)
PDF:
https://aclanthology.org/2020.findings-emnlp.446.pdf
Code
doug919/narrative_graph_emnlp2020