Continually Detection, Rapidly React: Unseen Rumors Detection Based on Continual Prompt-Tuning

Yuhui Zuo, Wei Zhu, Guoyong GUET Cai


Abstract
Since open social platforms allow a large and continuous flow of unverified information, rumors can emerge unexpectedly and spread quickly. However, existing rumor detection (RD) models often assume identical training and testing distributions and cannot cope with the continuously changing social network environment. This paper proposes a Continual Prompt-Tuning RD (CPT-RD) framework, which avoids catastrophic forgetting (CF) of upstream tasks during sequential task learning and enables bidirectional knowledge transfer between domain tasks. Specifically, we propose the following strategies: (a) our design explicitly decouples shared and domain-specific knowledge, reducing interference among different domains during optimization; (b) several techniques transfer knowledge from upstream tasks to deal with emerging events; (c) a task-conditioned prompt-wise hypernetwork (TPHNet) consolidates past domains. In addition, CPT-RD avoids CF without requiring a rehearsal buffer. Finally, CPT-RD is evaluated on English and Chinese RD datasets and is effective and efficient compared with prior state-of-the-art methods.
Anthology ID:
2022.coling-1.268
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
3029–3041
URL:
https://aclanthology.org/2022.coling-1.268
Cite (ACL):
Yuhui Zuo, Wei Zhu, and Guoyong GUET Cai. 2022. Continually Detection, Rapidly React: Unseen Rumors Detection Based on Continual Prompt-Tuning. In Proceedings of the 29th International Conference on Computational Linguistics, pages 3029–3041, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Continually Detection, Rapidly React: Unseen Rumors Detection Based on Continual Prompt-Tuning (Zuo et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.268.pdf