Severing the Edge Between Before and After: Neural Architectures for Temporal Ordering of Events

Miguel Ballesteros, Rishita Anubhai, Shuai Wang, Nima Pourdamghani, Yogarshi Vyas, Jie Ma, Parminder Bhatia, Kathleen McKeown, Yaser Al-Onaizan

Abstract
In this paper, we propose a neural architecture and a set of training methods for ordering events by predicting temporal relations. Our proposed models receive a pair of events within a span of text as input and identify the temporal relation (Before, After, Equal, or Vague) between them. Given that a key challenge with this task is the scarcity of annotated data, our models rely on pretrained representations (i.e., RoBERTa, BERT, or ELMo), transfer and multi-task learning (by leveraging complementary datasets), and self-training techniques. Experiments on the MATRES dataset of English documents establish a new state-of-the-art on this task.
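To make the setup in the abstract concrete, below is a minimal sketch of a pairwise temporal relation classifier built on a pretrained encoder. It is not the authors' exact architecture: the choice of RoBERTa-base, the event-token pooling, and the hypothetical class `PairwiseTemporalClassifier` are illustrative assumptions only; transfer, multi-task, and self-training components are omitted.

```python
# Hypothetical sketch (not the paper's implementation): a pair of event
# mentions in a text span is encoded with a pretrained encoder, and the two
# event-token vectors are concatenated and classified into one of the four
# MATRES relations (Before, After, Equal, Vague).
import torch
import torch.nn as nn
from transformers import RobertaModel, RobertaTokenizerFast

LABELS = ["Before", "After", "Equal", "Vague"]

class PairwiseTemporalClassifier(nn.Module):
    def __init__(self, encoder_name: str = "roberta-base"):
        super().__init__()
        self.encoder = RobertaModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Concatenate the contextual vectors of the two event tokens,
        # then project to the four temporal relation labels.
        self.classifier = nn.Linear(2 * hidden, len(LABELS))

    def forward(self, input_ids, attention_mask, e1_idx, e2_idx):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        states = out.last_hidden_state                 # (batch, seq, hidden)
        batch = torch.arange(input_ids.size(0))
        e1 = states[batch, e1_idx]                     # event-1 token vectors
        e2 = states[batch, e2_idx]                     # event-2 token vectors
        return self.classifier(torch.cat([e1, e2], dim=-1))

# Toy usage: relation between the events "ate" and "left" in one sentence.
tokenizer = RobertaTokenizerFast.from_pretrained("roberta-base")
model = PairwiseTemporalClassifier()
enc = tokenizer("She ate breakfast and then left for work.", return_tensors="pt")
# Event token positions found by inspection here; a real system would align
# annotated event-span character offsets to subword token indices.
e1_idx = torch.tensor([2])   # "ate"
e2_idx = torch.tensor([6])   # "left"
logits = model(enc["input_ids"], enc["attention_mask"], e1_idx, e2_idx)
print(LABELS[logits.argmax(-1).item()])  # untrained model: arbitrary label
```

A model along these lines would then be fine-tuned on MATRES annotations, optionally augmented with complementary datasets or self-labeled examples, as the abstract describes.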
Anthology ID:
2020.emnlp-main.436
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
5412–5417
URL:
https://aclanthology.org/2020.emnlp-main.436
DOI:
10.18653/v1/2020.emnlp-main.436
Cite (ACL):
Miguel Ballesteros, Rishita Anubhai, Shuai Wang, Nima Pourdamghani, Yogarshi Vyas, Jie Ma, Parminder Bhatia, Kathleen McKeown, and Yaser Al-Onaizan. 2020. Severing the Edge Between Before and After: Neural Architectures for Temporal Ordering of Events. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 5412–5417, Online. Association for Computational Linguistics.
Cite (Informal):
Severing the Edge Between Before and After: Neural Architectures for Temporal Ordering of Events (Ballesteros et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.436.pdf
Video:
https://slideslive.com/38938788