Embedding Time Differences in Context-sensitive Neural Networks for Learning Time to Event

Nazanin Dehghani, Hassan Hajipoor, Hadi Amiri


Abstract
We propose an effective context-sensitive neural model for the time-to-event (TTE) prediction task, which aims to predict the amount of time to/from the occurrence of given events in streaming content. We investigate this problem in the context of a multi-task learning framework, which we enrich with time difference embeddings. In addition, we develop a multi-genre dataset of English events about soccer competitions and Academy Awards ceremonies, along with their relevant tweets obtained from Twitter. Our model is 1.4 and 3.3 hours more accurate than the current state-of-the-art model in estimating TTE on English and Dutch tweets, respectively. We examine different aspects of our model to illustrate its source of improvement.
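The abstract's core idea, representing the time difference between a post and an event as a learned embedding rather than a raw scalar, can be illustrated with a minimal sketch. The bucket boundaries, embedding dimension, and helper names below are illustrative assumptions, not details from the paper: signed time differences (in hours) are discretized into logarithmically spaced buckets, each mapped to a trainable vector that a downstream model could concatenate with its text representation.

```python
import random

# Hypothetical logarithmic bucket boundaries in hours (not from the paper).
BUCKETS = [1, 2, 4, 8, 16, 32, 64]


def time_diff_bucket(delta_hours: float) -> int:
    """Map a signed time difference (hours) to a discrete bucket index.

    Negative deltas (post precedes the event) and non-negative deltas
    (post follows the event) occupy disjoint index ranges, so the sign
    of the difference is preserved in the embedding lookup.
    """
    sign_offset = 0 if delta_hours >= 0 else len(BUCKETS) + 1
    magnitude = abs(delta_hours)
    for i, boundary in enumerate(BUCKETS):
        if magnitude < boundary:
            return sign_offset + i
    return sign_offset + len(BUCKETS)  # overflow bucket for large gaps


# Toy embedding table: one small random vector per bucket. In a real
# model these would be trainable parameters (e.g. an embedding layer).
random.seed(0)
EMB_DIM = 4
NUM_BUCKETS = 2 * (len(BUCKETS) + 1)
emb_table = [
    [random.uniform(-0.1, 0.1) for _ in range(EMB_DIM)]
    for _ in range(NUM_BUCKETS)
]


def time_diff_embedding(delta_hours: float) -> list:
    """Look up the embedding vector for a signed time difference."""
    return emb_table[time_diff_bucket(delta_hours)]
```

A downstream TTE regressor would concatenate `time_diff_embedding(delta)` with a tweet's text encoding; the discretization lets the model learn distinct behavior for posts minutes versus days away from an event, which a single scalar feature makes harder to capture.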
Anthology ID:
2021.acl-short.80
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
630–636
URL:
https://aclanthology.org/2021.acl-short.80
DOI:
10.18653/v1/2021.acl-short.80
Cite (ACL):
Nazanin Dehghani, Hassan Hajipoor, and Hadi Amiri. 2021. Embedding Time Differences in Context-sensitive Neural Networks for Learning Time to Event. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 2: Short Papers), pages 630–636, Online. Association for Computational Linguistics.
Cite (Informal):
Embedding Time Differences in Context-sensitive Neural Networks for Learning Time to Event (Dehghani et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-short.80.pdf
Video:
https://aclanthology.org/2021.acl-short.80.mp4