Improving Cascade Decoding with Syntax-aware Aggregator and Contrastive Learning for Event Extraction

Sheng Zeyu, Liang Yuanyuan, Lan Yunshi
Abstract
The cascade decoding framework has shown superior performance on event extraction tasks. However, it treats a sentence as a sequence and neglects the potential benefits of the syntactic structure of sentences. In this paper, we improve cascade decoding with a novel module and a self-supervised task. Specifically, we propose a syntax-aware aggregator module that models the syntax of a sentence on top of the cascade decoding framework, so that it captures event dependencies as well as syntactic information. Moreover, we design a type discrimination task to learn better syntactic representations of different event types, which further boosts event extraction performance. Experimental results on two widely used event extraction datasets demonstrate that our method improves the original cascade decoding framework by up to 2.2 percentage points of F1 score and outperforms a number of competitive baseline methods.
Anthology ID:
2023.ccl-1.64
Volume:
Proceedings of the 22nd Chinese National Conference on Computational Linguistics
Month:
August
Year:
2023
Address:
Harbin, China
Editors:
Maosong Sun, Bing Qin, Xipeng Qiu, Jing Jiang, Xianpei Han
Venue:
CCL
Publisher:
Chinese Information Processing Society of China
Note:
Pages:
748–760
Language:
English
URL:
https://aclanthology.org/2023.ccl-1.64
Cite (ACL):
Sheng Zeyu, Liang Yuanyuan, and Lan Yunshi. 2023. Improving Cascade Decoding with Syntax-aware Aggregator and Contrastive Learning for Event Extraction. In Proceedings of the 22nd Chinese National Conference on Computational Linguistics, pages 748–760, Harbin, China. Chinese Information Processing Society of China.
Cite (Informal):
Improving Cascade Decoding with Syntax-aware Aggregator and Contrastive Learning for Event Extraction (Zeyu et al., CCL 2023)
PDF:
https://aclanthology.org/2023.ccl-1.64.pdf