Bi-Directional Recurrent Neural Ordinary Differential Equations for Social Media Text Classification

Maunika Tamire, Srinivas Anumasa, P. K. Srijith


Abstract
Classification of posts in social media such as Twitter is difficult due to the noisy and short nature of the texts. Sequence classification models based on recurrent neural networks (RNN) are popular for classifying posts that are sequential in nature. RNNs assume that the hidden representation dynamics evolve in a discrete manner and do not consider the exact time of posting. In this work, we propose to use recurrent neural ordinary differential equations (RNODE) for social media post classification, which consider the time of posting and allow the hidden representation to evolve continuously in a time-sensitive manner. In addition, we propose a novel model, bi-directional RNODE (Bi-RNODE), which considers information flow in both the forward and backward directions of posting times to predict the post label. Our experiments demonstrate that RNODE and Bi-RNODE are effective for stance classification of rumours in social media.
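The core idea — evolving the hidden state continuously between irregularly timed posts, with a forward and a backward pass combined for Bi-RNODE — can be sketched as follows. This is a minimal NumPy illustration under stated assumptions (random untrained parameters, tanh dynamics, simple Euler integration), not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(0)
D_IN, D_H = 8, 16  # illustrative embedding and hidden sizes

# Hypothetical parameters (random here; learned in a real model).
W_f = rng.normal(scale=0.1, size=(D_H, D_H))   # ODE dynamics weights
W_u = rng.normal(scale=0.1, size=(D_H, D_IN))  # input-update weights
W_h = rng.normal(scale=0.1, size=(D_H, D_H))   # hidden-update weights

def dynamics(h):
    """dh/dt = tanh(W_f h): continuous hidden-state evolution."""
    return np.tanh(W_f @ h)

def evolve(h, dt, n_steps=10):
    """Euler-integrate the hidden state over the time gap dt between posts."""
    step = dt / n_steps
    for _ in range(n_steps):
        h = h + step * dynamics(h)
    return h

def rnode(embeddings, times):
    """One RNODE pass: evolve h continuously, then update at each post."""
    h = np.zeros(D_H)
    t_prev = times[0]
    for x, t in zip(embeddings, times):
        h = evolve(h, t - t_prev)        # continuous evolution over the gap
        h = np.tanh(W_h @ h + W_u @ x)   # discrete RNN-style update at the post
        t_prev = t
    return h

def bi_rnode(embeddings, times):
    """Concatenate forward-time and backward-time RNODE representations."""
    h_fwd = rnode(embeddings, times)
    h_bwd = rnode(embeddings[::-1], [-t for t in times[::-1]])
    return np.concatenate([h_fwd, h_bwd])

# Illustrative usage: five post embeddings with irregular posting times.
posts = rng.normal(size=(5, D_IN))
times = [0.0, 0.5, 2.0, 2.1, 5.0]
rep = bi_rnode(posts, times)  # combined representation, shape (2 * D_H,)
```

The representation `rep` would feed a classifier head for the stance label; the key difference from a plain RNN is that `evolve` makes the state depend on the actual time gaps between posts, not just their order.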
Anthology ID:
2022.wit-1.3
Volume:
Proceedings of the 2nd Workshop on Deriving Insights from User-Generated Text
Month:
May
Year:
2022
Address:
(Hybrid) Dublin, Ireland, and Virtual
Editors:
Estevam Hruschka, Tom Mitchell, Dunja Mladenic, Marko Grobelnik, Nikita Bhutani
Venue:
WIT
Publisher:
Association for Computational Linguistics
Pages:
20–24
URL:
https://aclanthology.org/2022.wit-1.3
DOI:
10.18653/v1/2022.wit-1.3
Cite (ACL):
Maunika Tamire, Srinivas Anumasa, and P. K. Srijith. 2022. Bi-Directional Recurrent Neural Ordinary Differential Equations for Social Media Text Classification. In Proceedings of the 2nd Workshop on Deriving Insights from User-Generated Text, pages 20–24, (Hybrid) Dublin, Ireland, and Virtual. Association for Computational Linguistics.
Cite (Informal):
Bi-Directional Recurrent Neural Ordinary Differential Equations for Social Media Text Classification (Tamire et al., WIT 2022)
PDF:
https://aclanthology.org/2022.wit-1.3.pdf
Video:
https://aclanthology.org/2022.wit-1.3.mp4