Extracting Temporal Event Relation with Syntax-guided Graph Transformer

Shuaicheng Zhang, Qiang Ning, Lifu Huang


Abstract
Extracting temporal relations (e.g., before, after, and simultaneous) among events is crucial to natural language understanding. A key challenge is that when the events of interest are far apart in the text, the context between them often becomes complicated, making it hard to resolve their temporal relationship. This paper therefore proposes a new Syntax-guided Graph Transformer network (SGT) to mitigate this issue by (1) explicitly exploiting the connection between two events based on their dependency parse trees, and (2) automatically locating temporal cues between two events via a novel syntax-guided attention mechanism. Experiments on two benchmark datasets, MATRES and TB-DENSE, show that our approach significantly outperforms previous state-of-the-art methods on both end-to-end temporal relation extraction and temporal relation classification, with up to a 7.9% absolute F-score gain; this improvement also proves robust on the contrast set of MATRES. All programs are publicly available.
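To make the idea of syntax-guided attention concrete, here is a minimal, hypothetical sketch of self-attention restricted by a dependency parse: each token may only attend to itself, its dependency head, and its children. This is an illustrative assumption based on the abstract's description, not the paper's actual SGT architecture; the function name, the head-index encoding, and the hard masking scheme are all invented for this example.

```python
# Minimal sketch of dependency-masked self-attention (illustrative only;
# the real SGT model's formulation is not given in this abstract).
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def syntax_guided_attention(h, heads):
    """h: (n, d) token states; heads[i] = dependency head of token i (-1 = root).
    Restricts each token's attention to itself, its head, and its children."""
    n, d = h.shape
    mask = np.eye(n, dtype=bool)            # every token attends to itself
    for i, p in enumerate(heads):
        if p >= 0:                          # connect token <-> its head
            mask[i, p] = mask[p, i] = True
    scores = (h @ h.T) / np.sqrt(d)         # scaled dot-product scores
    scores = np.where(mask, scores, -1e9)   # block non-adjacent tokens
    return softmax(scores, axis=-1) @ h     # weighted sum of token states

rng = np.random.default_rng(0)
h = rng.standard_normal((5, 8))
out = syntax_guided_attention(h, heads=[1, -1, 1, 2, 2])
print(out.shape)  # (5, 8)
```

In practice, a soft bias toward dependency-adjacent tokens (rather than a hard mask) could let the model still attend to distant temporal cues, which is closer in spirit to "automatically locating temporal cues" as the abstract describes.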
Anthology ID:
2022.findings-naacl.29
Volume:
Findings of the Association for Computational Linguistics: NAACL 2022
Month:
July
Year:
2022
Address:
Seattle, United States
Editors:
Marine Carpuat, Marie-Catherine de Marneffe, Ivan Vladimir Meza Ruiz
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
379–390
URL:
https://aclanthology.org/2022.findings-naacl.29
DOI:
10.18653/v1/2022.findings-naacl.29
Cite (ACL):
Shuaicheng Zhang, Qiang Ning, and Lifu Huang. 2022. Extracting Temporal Event Relation with Syntax-guided Graph Transformer. In Findings of the Association for Computational Linguistics: NAACL 2022, pages 379–390, Seattle, United States. Association for Computational Linguistics.
Cite (Informal):
Extracting Temporal Event Relation with Syntax-guided Graph Transformer (Zhang et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-naacl.29.pdf
Video:
https://aclanthology.org/2022.findings-naacl.29.mp4
Code:
vt-nlp/syntax-guided-graph-transformer