Rumor Detection on Twitter with Claim-Guided Hierarchical Graph Attention Networks

Hongzhan Lin, Jing Ma, Mingfei Cheng, Zhiwei Yang, Liangliang Chen, Guang Chen


Abstract
Rumors are rampant in the era of social media. Conversation structures provide valuable clues to differentiate between real and fake claims. However, existing rumor detection methods are either limited to the strict relation of user responses or oversimplify the conversation structure. In this study, to substantially reinforce the interaction of user opinions while alleviating the negative impact imposed by irrelevant posts, we first represent the conversation thread as an undirected interaction graph. We then present a Claim-guided Hierarchical Graph Attention Network for rumor classification, which enhances the representation learning for responsive posts considering the entire social contexts and attends over the posts that can semantically infer the target claim. Extensive experiments on three Twitter datasets demonstrate that our rumor detection method achieves much better performance than state-of-the-art methods and exhibits a superior capacity for detecting rumors at early stages.
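To make the graph construction described above concrete, the following is a minimal sketch (not the authors' implementation): a toy reply thread is turned into an undirected interaction graph with self-loops, and one simplified attention-based aggregation step is run over each post's neighborhood. The edge list, the 2-dimensional post embeddings, and the dot-product scoring function are all hypothetical placeholders; a real GAT layer would use learned projections and a LeakyReLU-scored attention mechanism.

```python
import math

# Hypothetical toy thread: post 0 is the claim; replies give edges.
# The paper's interaction graph treats reply links as undirected.
edges = [(0, 1), (0, 2), (1, 3)]
num_nodes = 4

# Undirected adjacency with self-loops, so each post attends to itself too.
neighbors = {i: {i} for i in range(num_nodes)}
for u, v in edges:
    neighbors[u].add(v)
    neighbors[v].add(u)

# Placeholder 2-d post embeddings standing in for learned text features.
h = {0: [1.0, 0.0], 1: [0.5, 0.5], 2: [0.0, 1.0], 3: [0.8, 0.2]}

def attention_score(hi, hj):
    # Simplified scoring: a plain dot product instead of the learned
    # LeakyReLU(a^T [W h_i || W h_j]) used in actual GAT layers.
    return sum(a * b for a, b in zip(hi, hj))

def gat_layer(h, neighbors):
    """One attention-weighted aggregation step over each node's neighborhood."""
    out = {}
    for i, nbrs in neighbors.items():
        scores = {j: attention_score(h[i], h[j]) for j in nbrs}
        m = max(scores.values())                      # stabilize softmax
        exp = {j: math.exp(s - m) for j, s in scores.items()}
        z = sum(exp.values())
        alpha = {j: e / z for j, e in exp.items()}    # softmax over neighbors
        out[i] = [sum(alpha[j] * h[j][d] for j in nbrs) for d in range(2)]
    return out

updated = gat_layer(h, neighbors)
```

Because each updated embedding is a convex combination of its neighbors' embeddings, irrelevant posts receive low attention weights rather than being hard-pruned, which matches the motivation stated in the abstract.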
Anthology ID:
2021.emnlp-main.786
Volume:
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month:
November
Year:
2021
Address:
Online and Punta Cana, Dominican Republic
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
10035–10047
URL:
https://aclanthology.org/2021.emnlp-main.786
DOI:
10.18653/v1/2021.emnlp-main.786
Cite (ACL):
Hongzhan Lin, Jing Ma, Mingfei Cheng, Zhiwei Yang, Liangliang Chen, and Guang Chen. 2021. Rumor Detection on Twitter with Claim-Guided Hierarchical Graph Attention Networks. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 10035–10047, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Rumor Detection on Twitter with Claim-Guided Hierarchical Graph Attention Networks (Lin et al., EMNLP 2021)
PDF:
https://aclanthology.org/2021.emnlp-main.786.pdf
Video:
https://aclanthology.org/2021.emnlp-main.786.mp4