Modularized Syntactic Neural Networks for Sentence Classification

Haiyan Wu, Ying Liu, Shaoyun Shi


Abstract
This paper focuses on tree-based modeling for the sentence classification task. In existing works, aggregation over a syntax tree usually considers only the local information of sub-trees. In contrast, in addition to this local information, our proposed Modularized Syntactic Neural Network (MSNN) utilizes syntax category labels and takes advantage of the global context while modeling sub-trees. In MSNN, each node of a syntax tree is modeled by a label-related syntax module. Each syntax module aggregates the outputs of lower-level modules, and finally, the root module provides the sentence representation. We design a tree-parallel mini-batch strategy for efficient training and prediction. Experimental results on four benchmark datasets show that our MSNN significantly outperforms previous state-of-the-art tree-based methods on the sentence classification task.
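The bottom-up aggregation described in the abstract can be sketched as a recursive traversal in which each node dispatches to a module indexed by its syntax category label. This is a minimal illustrative sketch, not the paper's actual architecture: the module form here (mean of child vectors plus a per-label bias) and all names are placeholder assumptions.

```python
# Hypothetical sketch of label-indexed, bottom-up aggregation over a
# parse tree. The per-label "module" below (mean of children plus a
# label-specific bias) is an assumed stand-in for the paper's modules.
from dataclasses import dataclass, field
from typing import List, Optional

DIM = 4  # toy hidden size

@dataclass
class Node:
    label: str                                  # syntax category, e.g. "NP"
    word_vec: Optional[List[float]] = None      # leaf word embedding
    children: List["Node"] = field(default_factory=list)

# One "syntax module" per category label (hypothetical parameters).
MODULE_BIAS = {"S": 0.1, "NP": 0.2, "VP": 0.3}

def encode(node: Node) -> List[float]:
    """Bottom-up: each node's module aggregates its children's outputs."""
    if not node.children:                       # leaf: use the word embedding
        return node.word_vec
    child_vecs = [encode(c) for c in node.children]
    mean = [sum(v[i] for v in child_vecs) / len(child_vecs) for i in range(DIM)]
    b = MODULE_BIAS.get(node.label, 0.0)        # label-specific transform
    return [x + b for x in mean]

# Toy tree for "the cat sleeps": (S (NP (DT the) (NN cat)) (VP (VBZ sleeps)))
tree = Node("S", children=[
    Node("NP", children=[
        Node("DT", word_vec=[1.0, 0.0, 0.0, 0.0]),
        Node("NN", word_vec=[0.0, 1.0, 0.0, 0.0]),
    ]),
    Node("VP", children=[Node("VBZ", word_vec=[0.0, 0.0, 1.0, 0.0])]),
])

sentence_vec = encode(tree)  # root module output = sentence representation
print(sentence_vec)
```

The root call returns the sentence representation, which a classifier head would then consume; the paper's tree-parallel mini-batch strategy would replace this naive recursion with batched evaluation of same-depth nodes.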
Anthology ID:
2020.emnlp-main.222
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
2786–2792
URL:
https://aclanthology.org/2020.emnlp-main.222
DOI:
10.18653/v1/2020.emnlp-main.222
Cite (ACL):
Haiyan Wu, Ying Liu, and Shaoyun Shi. 2020. Modularized Syntactic Neural Networks for Sentence Classification. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 2786–2792, Online. Association for Computational Linguistics.
Cite (Informal):
Modularized Syntactic Neural Networks for Sentence Classification (Wu et al., EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.222.pdf
Video:
https://slideslive.com/38938796