A Graph Enhanced BERT Model for Event Prediction

Li Du, Xiao Ding, Yue Zhang, Ting Liu, Bing Qin


Abstract
Predicting the subsequent event for an existing event context is an important but challenging task, as it requires understanding the underlying relationships between events. Previous methods retrieve relational features from an event graph to enhance the modeling of event correlation. However, the sparsity of the event graph may restrict the acquisition of relevant graph information and thus hurt model performance. To address this issue, we consider automatically building the event graph with a BERT model. To this end, we incorporate an additional structured variable into BERT that learns to predict event connections during training. At test time, the connections of unseen events can then be predicted by the structured variable. Results on two event prediction tasks, script event prediction and story ending prediction, show that our approach outperforms state-of-the-art baseline methods.
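The structured-variable idea in the abstract can be illustrated with a toy sketch: score candidate connections between event representations and train that scorer jointly with the next-event objective. Everything below (the random-vector embedding stand-in, `connection_prob`, and the way the two losses are combined) is a hypothetical illustration of the general idea, not the authors' implementation.

```python
import math
import random

DIM = 8

def embed(event_id):
    # Toy stand-in for a BERT event encoding: a fixed pseudo-random
    # vector per event id (deterministic, so repeated calls agree).
    rng = random.Random(event_id)
    return [rng.uniform(-1.0, 1.0) for _ in range(DIM)]

def connection_prob(event_a, event_b):
    # Structured-variable idea in miniature: score a candidate edge
    # between two events with a dot product pushed through a sigmoid.
    score = sum(a * b for a, b in zip(embed(event_a), embed(event_b)))
    return 1.0 / (1.0 + math.exp(-score))

def joint_loss(next_event_loss, edges, labels):
    # Training-objective sketch: the usual next-event prediction loss
    # plus a binary cross-entropy term that teaches the model to
    # predict which events are connected in the event graph.
    bce = 0.0
    for (a, b), y in zip(edges, labels):
        p = connection_prob(a, b)
        bce += -(y * math.log(p) + (1 - y) * math.log(1.0 - p))
    return next_event_loss + bce / len(edges)

# Usage: combine a (hypothetical) next-event loss of 0.7 with two
# candidate edges, one labeled connected and one not.
loss = joint_loss(0.7, [(1, 2), (1, 3)], [1, 0])
```

Because the edge scorer depends only on the event encodings, it can assign connection probabilities to event pairs never seen as edges during training, which is the point of learning the graph structure rather than looking it up.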
Anthology ID: 2022.findings-acl.206
Original: 2022.findings-acl.206v1
Version 2: 2022.findings-acl.206v2
Volume: Findings of the Association for Computational Linguistics: ACL 2022
Month: May
Year: 2022
Address: Dublin, Ireland
Editors: Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 2628–2638
URL: https://aclanthology.org/2022.findings-acl.206
DOI: 10.18653/v1/2022.findings-acl.206
Cite (ACL): Li Du, Xiao Ding, Yue Zhang, Ting Liu, and Bing Qin. 2022. A Graph Enhanced BERT Model for Event Prediction. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2628–2638, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal): A Graph Enhanced BERT Model for Event Prediction (Du et al., Findings 2022)
PDF: https://aclanthology.org/2022.findings-acl.206.pdf
Software: 2022.findings-acl.206.software.zip