Self-Attention Graph Residual Convolutional Networks for Event Detection with dependency relations

Anan Liu, Ning Xu, Haozhe Liu


Abstract
The event detection (ED) task aims to classify events by identifying key event trigger words embedded in a piece of text. Previous research has demonstrated the validity of fusing syntactic dependency relations into Graph Convolutional Networks (GCN). However, existing GCN-based methods explore latent node-to-node dependency relations according to a stationary adjacency tensor; an attention-based dynamic tensor, which can focus on key nodes such as event triggers or their neighboring nodes, has not been developed. At the same time, suffering from graph information vanishing caused by the symmetric adjacency tensor, existing GCN models cannot achieve higher overall performance. In this paper, we propose a novel model, Self-Attention Graph Residual Convolution Networks (SA-GRCN), to mine latent node-to-node dependency relations via a self-attention mechanism and introduce a Graph Residual Network (GResNet) to solve the graph information vanishing problem. Specifically, a self-attention module is constructed to generate an attention tensor representing the dependency attention scores of all words in the sentence. Furthermore, a graph residual term is added to the baseline SA-GCN to construct the GResNet. Considering the syntactic connectivity of the network input, we use the raw adjacency tensor, not processed by the self-attention module, as the residual term. We conduct experiments on the ACE 2005 dataset, and the results show significant improvement over competitive baseline methods.
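The abstract describes two components: a self-attention module that turns the static dependency adjacency into a dynamic attention tensor, and a graph residual term built from the raw adjacency. The sketch below illustrates how one such layer might look; it is a minimal assumption-based illustration (class and variable names such as SAGRCNLayer and dep_adj are hypothetical), not the authors' implementation.

```python
# Minimal sketch of one self-attention graph residual convolution layer,
# assuming the layer combines an attention-weighted adjacency with a
# residual term driven by the raw (unprocessed) dependency adjacency.
import torch
import torch.nn as nn
import torch.nn.functional as F


class SAGRCNLayer(nn.Module):
    """Hypothetical SA-GRCN layer over a syntactic dependency graph."""

    def __init__(self, hidden_dim):
        super().__init__()
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.gcn = nn.Linear(hidden_dim, hidden_dim)
        self.residual = nn.Linear(hidden_dim, hidden_dim)

    def forward(self, h, dep_adj):
        # h: (batch, seq_len, hidden_dim) word representations
        # dep_adj: (batch, seq_len, seq_len) raw dependency adjacency tensor
        # Self-attention module: dynamic node-to-node attention scores.
        scores = self.query(h) @ self.key(h).transpose(1, 2)
        attn_adj = F.softmax(scores / h.size(-1) ** 0.5, dim=-1) * dep_adj
        # Graph convolution with the attention-weighted adjacency.
        conv = attn_adj @ self.gcn(h)
        # Graph residual term using the raw adjacency, as the abstract suggests.
        res = dep_adj @ self.residual(h)
        return F.relu(conv + res)


# Usage example with random inputs: 2 sentences, 10 tokens, hidden size 128.
h = torch.randn(2, 10, 128)
dep_adj = torch.eye(10).expand(2, 10, 10)
out = SAGRCNLayer(128)(h, dep_adj)  # shape: (2, 10, 128)
```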
Anthology ID:
2021.findings-emnlp.28
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2021
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue:
Findings
SIG:
SIGDAT
Publisher:
Association for Computational Linguistics
Note:
Pages:
302–311
Language:
URL:
https://aclanthology.org/2021.findings-emnlp.28
DOI:
10.18653/v1/2021.findings-emnlp.28
Bibkey:
Cite (ACL):
Anan Liu, Ning Xu, and Haozhe Liu. 2021. Self-Attention Graph Residual Convolutional Networks for Event Detection with dependency relations. In Findings of the Association for Computational Linguistics: EMNLP 2021, pages 302–311, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Self-Attention Graph Residual Convolutional Networks for Event Detection with dependency relations (Liu et al., Findings 2021)
PDF:
https://aclanthology.org/2021.findings-emnlp.28.pdf
Video:
https://aclanthology.org/2021.findings-emnlp.28.mp4