One-Model-Connects-All: A Unified Graph Pre-Training Model for Online Community Modeling

Ruoxue Ma, Jiarong Xu, Xinnong Zhang, Haozhe Zhang, Zuyu Zhao, Qi Zhang, Xuanjing Huang, Zhongyu Wei

Abstract
Online communities comprise communities, users, and user-generated textual content, and this rich information can help address social problems. Previous research has neither fully exploited these three components and the relationships among them, nor adapted well to a wide range of downstream tasks. To address these limitations, we propose a framework that jointly considers communities, users, and texts, and that readily connects to a variety of downstream tasks related to social media. Specifically, we model online communities as a ternary heterogeneous graph, and use text reconstruction and edge generation to learn structural and semantic knowledge among communities, users, and texts. Leveraging this pre-trained model, we achieve promising results across multiple downstream tasks, such as violation detection, sentiment analysis, and community recommendation. We hope our exploration advances online community modeling.
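The paper's own code is not reproduced here, but the ternary heterogeneous graph described in the abstract can be sketched minimally with PyTorch Geometric's HeteroData. The node counts, feature dimensions, and relation names below are illustrative placeholders, not the authors' actual schema.

```python
# Illustrative sketch only: a ternary heterogeneous graph over community,
# user, and text nodes, as described in the abstract. Node counts, feature
# dimensions, and relation names are assumed placeholders.
import torch
from torch_geometric.data import HeteroData

data = HeteroData()

# Placeholder node features; in practice, text nodes would carry encoder
# embeddings of user-generated content, and user/community nodes learned
# representations.
data["community"].x = torch.randn(3, 64)   # 3 communities
data["user"].x = torch.randn(10, 64)       # 10 users
data["text"].x = torch.randn(25, 64)       # 25 posts/comments

# Edges among the three node types, given as [source_indices, target_indices].
data["user", "member_of", "community"].edge_index = torch.tensor(
    [[0, 1, 2], [0, 0, 1]]
)
data["user", "writes", "text"].edge_index = torch.tensor(
    [[0, 0, 1], [0, 1, 2]]
)
data["text", "posted_in", "community"].edge_index = torch.tensor(
    [[0, 1, 2], [0, 1, 2]]
)

print(data)  # summary of node/edge types and their shapes
```

The pre-training objectives mentioned in the abstract (text reconstruction and edge generation) would then operate on such a graph, e.g., by reconstructing masked text-node content and predicting held-out edges; that training loop is beyond this sketch.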
Anthology ID:
2023.findings-emnlp.1003
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
15034–15045
URL:
https://aclanthology.org/2023.findings-emnlp.1003
DOI:
10.18653/v1/2023.findings-emnlp.1003
Cite (ACL):
Ruoxue Ma, Jiarong Xu, Xinnong Zhang, Haozhe Zhang, Zuyu Zhao, Qi Zhang, Xuanjing Huang, and Zhongyu Wei. 2023. One-Model-Connects-All: A Unified Graph Pre-Training Model for Online Community Modeling. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 15034–15045, Singapore. Association for Computational Linguistics.
Cite (Informal):
One-Model-Connects-All: A Unified Graph Pre-Training Model for Online Community Modeling (Ma et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.1003.pdf