Semi-supervised New Event Type Induction and Description via Contrastive Loss-Enforced Batch Attention

Carl Edwards, Heng Ji


Abstract
Most event extraction methods have traditionally relied on an annotated set of event types. However, creating event ontologies and annotating supervised training data are expensive and time-consuming. Previous work has proposed semi-supervised approaches which leverage seen (annotated) types to learn how to automatically discover new event types. State-of-the-art methods, both semi-supervised and fully unsupervised, use a form of reconstruction loss on specific tokens in a context. In contrast, we present a novel approach to semi-supervised new event type induction using a masked contrastive loss, which learns similarities between event mentions by enforcing an attention mechanism over the data minibatch. We further disentangle the discovered clusters by approximating the underlying manifolds in the data, which allows us to achieve an Adjusted Rand Index (ARI) score of 48.85%. Building on these clustering results, we extend our approach to two new tasks: predicting the type name of the discovered clusters and linking them to FrameNet frames.
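The core idea in the abstract, attention over a minibatch of event-mention embeddings trained with a contrastive loss that pulls same-type mentions together, can be sketched roughly as follows. This is a minimal numpy stand-in, not the paper's implementation: the function name, shapes, and the exact loss normalization are illustrative assumptions.

```python
import numpy as np

def batch_attention_contrastive_loss(embeddings, labels):
    """Hypothetical sketch: dot-product attention over a minibatch of
    event-mention embeddings, supervised by a contrastive target that
    attends only to mentions sharing the same (seen) event type."""
    # Pairwise attention logits over the batch.
    sim = embeddings @ embeddings.T
    np.fill_diagonal(sim, -np.inf)  # a mention does not attend to itself

    # Softmax attention of each mention over the rest of the batch.
    attn = np.exp(sim - sim.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)

    # Contrastive target: uniform attention over same-type mentions only.
    same = (labels[:, None] == labels[None, :]).astype(float)
    np.fill_diagonal(same, 0.0)

    # Cross-entropy between the attention and the normalized target,
    # restricted to rows that have at least one in-batch positive.
    rows = same.sum(axis=1) > 0
    target = same[rows] / same[rows].sum(axis=1, keepdims=True)
    return -(target * np.log(attn[rows] + 1e-12)).sum(axis=1).mean()

rng = np.random.default_rng(0)
emb = rng.normal(size=(8, 4))
labels = np.array([0, 0, 1, 1, 2, 2, 3, 3])
loss = batch_attention_contrastive_loss(emb, labels)
```

Minimizing such a loss encourages the attention distribution of each mention to concentrate on same-type mentions in the batch, so the learned similarities can later be used to cluster mentions of unseen event types.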
Anthology ID:
2023.eacl-main.275
Volume:
Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics
Month:
May
Year:
2023
Address:
Dubrovnik, Croatia
Editors:
Andreas Vlachos, Isabelle Augenstein
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
3805–3827
URL:
https://aclanthology.org/2023.eacl-main.275
DOI:
10.18653/v1/2023.eacl-main.275
Cite (ACL):
Carl Edwards and Heng Ji. 2023. Semi-supervised New Event Type Induction and Description via Contrastive Loss-Enforced Batch Attention. In Proceedings of the 17th Conference of the European Chapter of the Association for Computational Linguistics, pages 3805–3827, Dubrovnik, Croatia. Association for Computational Linguistics.
Cite (Informal):
Semi-supervised New Event Type Induction and Description via Contrastive Loss-Enforced Batch Attention (Edwards & Ji, EACL 2023)
PDF:
https://aclanthology.org/2023.eacl-main.275.pdf
Video:
https://aclanthology.org/2023.eacl-main.275.mp4