Pretraining Language Models with Text-Attributed Heterogeneous Graphs

Tao Zou, Le Yu, Yifei Huang, Leilei Sun, Bowen Du


Abstract
In many real-world scenarios (e.g., academic networks, social platforms), different types of entities are not only associated with texts but also connected by various relationships, which can be abstracted as Text-Attributed Heterogeneous Graphs (TAHGs). Current pretraining tasks for Language Models (LMs) primarily focus on learning the textual information of each entity in isolation and overlook the crucial aspect of capturing topological connections among entities in TAHGs. In this paper, we present a new pretraining framework for LMs that explicitly considers the topological and heterogeneous information in TAHGs. First, we define a context graph as the neighborhood of a target node within a given number of hops and propose a topology-aware pretraining task that predicts the nodes involved in the context graph by jointly optimizing an LM and an auxiliary heterogeneous graph neural network. Second, based on the observation that some nodes are text-rich while others have little text, we devise a text augmentation strategy that enriches textless nodes with their neighbors' texts to handle this imbalance. We conduct link prediction and node classification tasks on three datasets from various domains. Experimental results demonstrate the superiority of our approach over existing methods and the soundness of each design choice. Our code is available at https://github.com/Hope-Rita/THLM.
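The abstract describes two components: a context graph built from a target node's neighbors within a given number of hops (the prediction targets of the topology-aware pretraining task) and a text augmentation strategy for textless nodes. The following is a minimal, illustrative Python sketch of both ideas, not the authors' implementation (see the linked repository for that); the adjacency representation, function names, and the `[SEP]` separator are assumptions made for the example.

```python
from collections import defaultdict, deque

def context_graph_nodes(adj, target, max_order=2):
    """Collect all nodes reachable from `target` within `max_order` hops.

    These neighbors form the "context graph" whose membership the
    topology-aware pretraining task asks the LM (jointly with an auxiliary
    heterogeneous GNN) to predict. `adj` maps a node id to a list of
    (neighbor id, relation type) pairs.
    """
    visited = {target}
    frontier = deque([(target, 0)])
    context = set()
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_order:
            continue
        for neighbor, _relation in adj[node]:
            if neighbor not in visited:
                visited.add(neighbor)
                context.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return context

def augment_textless_node(node, texts, adj, max_neighbors=5, sep=" [SEP] "):
    """Enrich a node with little or no text using its neighbors' texts,
    mirroring the text augmentation strategy sketched in the abstract."""
    own_text = texts.get(node, "")
    if own_text.strip():
        return own_text
    neighbor_texts = [texts[nbr] for nbr, _rel in adj[node]
                      if texts.get(nbr, "").strip()][:max_neighbors]
    return sep.join(neighbor_texts)

if __name__ == "__main__":
    # Toy heterogeneous graph: papers (p*) linked to authors (a*) and venues (v*).
    adj = defaultdict(list, {
        "p1": [("a1", "written_by"), ("v1", "published_in")],
        "a1": [("p1", "writes"), ("p2", "writes")],
        "v1": [("p1", "publishes")],
        "p2": [("a1", "written_by")],
    })
    texts = {"p1": "Pretraining LMs with TAHGs.",
             "p2": "A study of graph neural networks."}
    print(context_graph_nodes(adj, "p1", max_order=2))  # {'a1', 'v1', 'p2'}
    print(augment_textless_node("a1", texts, adj))      # neighbors' titles joined by [SEP]
```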
Anthology ID: 2023.findings-emnlp.692
Volume: Findings of the Association for Computational Linguistics: EMNLP 2023
Month: December
Year: 2023
Address: Singapore
Editors: Houda Bouamor, Juan Pino, Kalika Bali
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 10316–10333
URL: https://aclanthology.org/2023.findings-emnlp.692
DOI: 10.18653/v1/2023.findings-emnlp.692
Cite (ACL): Tao Zou, Le Yu, Yifei Huang, Leilei Sun, and Bowen Du. 2023. Pretraining Language Models with Text-Attributed Heterogeneous Graphs. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 10316–10333, Singapore. Association for Computational Linguistics.
Cite (Informal): Pretraining Language Models with Text-Attributed Heterogeneous Graphs (Zou et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-emnlp.692.pdf