Cross-Topic Rumor Detection using Topic-Mixtures

Xiaoying Ren, Jing Jiang, Ling Min Serena Khoo, Hai Leong Chieu


Abstract
There has been much interest in rumor detection using deep learning models in recent years. A well-known limitation of deep learning models is that they tend to learn superficial patterns, which restricts their generalization ability. We find that this is also true for cross-topic rumor detection. In this paper, we propose a method inspired by the “mixture of experts” paradigm. We assume that the prediction of the rumor class label given an instance depends on the topic distribution of the instance. After deriving a vector representation for each topic, we compute a “topic mixture” vector for each instance based on its topic distribution. This topic mixture is combined with the vector representation of the instance itself to make rumor predictions. Our experiments show that our proposed method outperforms two baseline debiasing methods in a cross-topic setting. In a synthetic setting where topic-specific words are removed, our method also outperforms the baselines, showing that it does not rely on superficial features.
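The pipeline the abstract describes (per-topic vectors, a topic-mixture vector weighted by the instance's topic distribution, combined with the instance representation for classification) can be sketched roughly as follows. The dimensions, the randomly initialized "learned" topic vectors, and concatenation as the combination step are illustrative assumptions, not the paper's exact architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

K, d = 5, 8                      # number of topics, embedding dim (illustrative)
T = rng.normal(size=(K, d))      # one vector per topic (learned in the paper; random here)

def topic_mixture(theta, topic_vecs):
    """Weight each topic's vector by the instance's topic distribution."""
    return theta @ topic_vecs    # shape (d,)

theta = np.array([0.6, 0.2, 0.1, 0.05, 0.05])  # instance's topic distribution (sums to 1)
h = rng.normal(size=d)                         # instance representation (e.g. from an encoder)

m = topic_mixture(theta, T)
features = np.concatenate([h, m])              # combined input to the rumor classifier
assert features.shape == (2 * d,)
```

Under this sketch, the classifier never sees the raw topic identity, only a soft mixture of topic representations, which is what lets the model interpolate across topics at test time.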
Anthology ID:
2021.eacl-main.131
Volume:
Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume
Month:
April
Year:
2021
Address:
Online
Editors:
Paola Merlo, Jörg Tiedemann, Reut Tsarfaty
Venue:
EACL
Publisher:
Association for Computational Linguistics
Pages:
1534–1538
URL:
https://aclanthology.org/2021.eacl-main.131
DOI:
10.18653/v1/2021.eacl-main.131
Cite (ACL):
Xiaoying Ren, Jing Jiang, Ling Min Serena Khoo, and Hai Leong Chieu. 2021. Cross-Topic Rumor Detection using Topic-Mixtures. In Proceedings of the 16th Conference of the European Chapter of the Association for Computational Linguistics: Main Volume, pages 1534–1538, Online. Association for Computational Linguistics.
Cite (Informal):
Cross-Topic Rumor Detection using Topic-Mixtures (Ren et al., EACL 2021)
PDF:
https://aclanthology.org/2021.eacl-main.131.pdf