VIBE: Topic-Driven Temporal Adaptation for Twitter Classification

Yuji Zhang, Jing Li, Wenjie Li


Abstract
Language features evolve continually on real-world social media, causing text classification performance to deteriorate over time. To address this challenge, we study temporal adaptation, where models trained on past data are tested on future data. Most prior work has focused on continued pretraining or knowledge updating, which may compromise performance on noisy social media data. To tackle this issue, we capture feature change by modeling latent topic evolution and propose a novel model, VIBE: Variational Information Bottleneck for Evolutions. Concretely, we first employ two Information Bottleneck (IB) regularizers to distinguish past and future topics. The distinguished topics then serve as adaptive features through multi-task training with timestamp and class label prediction. For adaptive learning, VIBE uses unlabeled data retrieved from online streams created after the training data's time span. Extensive Twitter experiments on three classification tasks show that our model, using only 3% of the data, significantly outperforms previous state-of-the-art continued-pretraining methods.
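To make the described objective concrete, below is a minimal PyTorch sketch of the kind of training setup the abstract outlines: a variational bottleneck over latent topic features with a KL compression term, plus two task heads for timestamp (past vs. future) and class label prediction. All module names, dimensions, and hyperparameters here are illustrative assumptions, not the authors' released implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class VariationalTopicAdapter(nn.Module):
    """Illustrative sketch: IB-style latent topics + multi-task heads."""

    def __init__(self, input_dim=768, latent_dim=50, num_classes=2, beta=1e-3):
        super().__init__()
        self.beta = beta                                   # weight of the KL (bottleneck) term
        self.mu = nn.Linear(input_dim, latent_dim)
        self.logvar = nn.Linear(input_dim, latent_dim)
        self.class_head = nn.Linear(latent_dim, num_classes)  # task label prediction
        self.time_head = nn.Linear(latent_dim, 2)             # timestamp: past vs. future

    def forward(self, text_repr):
        mu, logvar = self.mu(text_repr), self.logvar(text_repr)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        return z, mu, logvar

    def loss(self, text_repr, class_labels, time_labels):
        z, mu, logvar = self(text_repr)
        # KL(q(z|x) || N(0, I)) acts as the information-bottleneck regularizer.
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        class_loss = F.cross_entropy(self.class_head(z), class_labels)
        time_loss = F.cross_entropy(self.time_head(z), time_labels)
        return class_loss + time_loss + self.beta * kl


# Toy usage with random vectors standing in for encoded tweet representations.
if __name__ == "__main__":
    model = VariationalTopicAdapter()
    reprs = torch.randn(8, 768)
    class_labels = torch.randint(0, 2, (8,))
    time_labels = torch.randint(0, 2, (8,))
    print(model.loss(reprs, class_labels, time_labels).item())
```

In this sketch the timestamp head plays the role of distinguishing past from future data, while the shared latent space carries the adaptive features used for label prediction; the actual VIBE model uses two IB regularizers over evolving topics rather than this single Gaussian prior.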
Anthology ID:
2023.emnlp-main.203
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
3340–3354
URL:
https://aclanthology.org/2023.emnlp-main.203
DOI:
10.18653/v1/2023.emnlp-main.203
Cite (ACL):
Yuji Zhang, Jing Li, and Wenjie Li. 2023. VIBE: Topic-Driven Temporal Adaptation for Twitter Classification. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 3340–3354, Singapore. Association for Computational Linguistics.
Cite (Informal):
VIBE: Topic-Driven Temporal Adaptation for Twitter Classification (Zhang et al., EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.203.pdf
Video:
https://aclanthology.org/2023.emnlp-main.203.mp4