ARGUABLY @ Causal News Corpus 2022: Contextually Augmented Language Models for Event Causality Identification

Guneet Kohli, Prabsimran Kaur, Jatin Bedi


Abstract
Causality (a cause-effect relationship between two arguments) has become integral to various NLP domains such as question answering, summarization, and event prediction. To advance the study of causality, the Event Causality Identification with Causal News Corpus shared task was organized at CASE 2022. This paper describes our participation in Subtask 1, which focuses on classifying event causality. We used sentence-level augmentation based on the contextualized word embeddings of DistilBERT to construct new data. Models were then trained on this data using two approaches: the first used the DeBERTa language model, and the second combined the RoBERTa language model with cross-attention. Our Contextually Augmented DeBERTa model obtained the second-best F1 score (0.8610) in the competition.
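The augmentation step the abstract describes (substituting words using contextual predictions from DistilBERT) can be sketched as follows. This is a minimal, hypothetical illustration, not the authors' implementation: `propose_substitutes` is a stand-in stub with a toy lexicon; a real system would mask the token and query a DistilBERT masked-language-model head (e.g., a fill-mask pipeline) for ranked, context-aware replacement candidates.

```python
def propose_substitutes(tokens, idx):
    """Stand-in for a DistilBERT fill-mask call at position `idx`.

    A real implementation would mask tokens[idx], run the masked sentence
    through DistilBERT, and return the model's top-scoring candidate tokens.
    This stub uses a tiny hand-written lexicon purely for illustration.
    """
    toy_lexicon = {
        "crisis": ["conflict", "turmoil"],
        "caused": ["triggered", "sparked"],
    }
    return toy_lexicon.get(tokens[idx], [])


def augment_sentence(sentence):
    """Produce one augmented variant by substituting tokens in place.

    Each token with available candidates is replaced by the top candidate;
    tokens with no candidates are kept unchanged, preserving sentence length.
    """
    tokens = sentence.split()
    out = []
    for i, tok in enumerate(tokens):
        candidates = propose_substitutes(tokens, i)
        out.append(candidates[0] if candidates else tok)
    return " ".join(out)
```

For example, `augment_sentence("the crisis caused protests")` yields `"the conflict triggered protests"`, a new training sentence with the same causal structure as the original, which is the property the augmentation must preserve for the causality-classification task.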
Anthology ID:
2022.case-1.20
Volume:
Proceedings of the 5th Workshop on Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE)
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates (Hybrid)
Editors:
Ali Hürriyetoğlu, Hristo Tanev, Vanni Zavarella, Erdem Yörük
Venue:
CASE
Publisher:
Association for Computational Linguistics
Pages:
143–148
URL:
https://aclanthology.org/2022.case-1.20
DOI:
10.18653/v1/2022.case-1.20
Cite (ACL):
Guneet Kohli, Prabsimran Kaur, and Jatin Bedi. 2022. ARGUABLY @ Causal News Corpus 2022: Contextually Augmented Language Models for Event Causality Identification. In Proceedings of the 5th Workshop on Challenges and Applications of Automated Extraction of Socio-political Events from Text (CASE), pages 143–148, Abu Dhabi, United Arab Emirates (Hybrid). Association for Computational Linguistics.
Cite (Informal):
ARGUABLY @ Causal News Corpus 2022: Contextually Augmented Language Models for Event Causality Identification (Kohli et al., CASE 2022)
PDF:
https://aclanthology.org/2022.case-1.20.pdf