Renchu Guan


2021

Deep Attention Diffusion Graph Neural Networks for Text Classification
Yonghao Liu | Renchu Guan | Fausto Giunchiglia | Yanchun Liang | Xiaoyue Feng
Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing

Text classification is a fundamental task with broad applications in natural language processing. Recently, graph neural networks (GNNs) have attracted much attention due to their powerful representation ability. However, most existing GNN-based methods for text classification consider only one-hop neighborhoods and low-frequency information within texts, and so cannot fully exploit the rich contextual information of documents. Moreover, these models suffer from over-smoothing when many graph layers are stacked. In this paper, a Deep Attention Diffusion Graph Neural Network (DADGNN) model is proposed to learn text representations, bridging the gap between a word and its distant neighbors. Experimental results on various standard benchmark datasets demonstrate the superior performance of the proposed approach.
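As a rough illustration of the attention-diffusion idea the abstract describes, the sketch below propagates attention beyond one-hop neighbors by summing decayed powers of an attention matrix. This is a minimal sketch only, assuming a personalized-PageRank-style decay θ_k = α(1−α)^k; the names `attention_diffusion`, `theta`, and the toy matrix `A` are illustrative and not taken from the DADGNN paper, whose exact formulation may differ.

```python
import numpy as np

def attention_diffusion(A, theta):
    """Diffuse a row-stochastic attention matrix over multiple hops.

    Returns sum_k theta[k] * A^k, so a node can attend to neighbors
    several hops away even when its one-hop attention to them is zero.
    """
    n = A.shape[0]
    A_power = np.eye(n)          # A^0
    diffused = np.zeros_like(A)
    for t in theta:
        diffused += t * A_power  # add the k-hop contribution
        A_power = A_power @ A    # advance to the next power of A
    return diffused

# Toy 3-node path graph: row-normalized adjacency with self-loops.
# Node 0 and node 2 are NOT one-hop neighbors (A[0, 2] == 0).
A = np.array([[0.5, 0.5, 0.0],
              [1/3, 1/3, 1/3],
              [0.0, 0.5, 0.5]])

alpha = 0.2  # illustrative decay rate (an assumption, not from the paper)
theta = [alpha * (1 - alpha) ** k for k in range(4)]
D = attention_diffusion(A, theta)
```

After diffusion, `D[0, 2]` is positive even though `A[0, 2]` is zero, which is the sense in which diffusion lets a word interact with distant neighbors without stacking many GNN layers.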