Weighted Neural Bag-of-n-grams Model: New Baselines for Text Classification

Bofang Li, Zhe Zhao, Tao Liu, Puwei Wang, Xiaoyong Du


Abstract
NBSVM is one of the most popular methods for text classification and has been widely used as a baseline for various text representation approaches. It uses a Naive Bayes (NB) feature to weight a sparse bag-of-n-grams representation: n-grams capture word order in short contexts, and the NB feature assigns higher weights to important words. However, NBSVM suffers from the sparsity problem and has been reported to be outperformed by recently proposed distributed (dense) text representations learned by neural networks. In this paper, we transfer n-grams and NB weighting to neural models. We train n-gram embeddings and use NB weighting to guide the neural models to focus on important words. In effect, our methods can be viewed as distributed (dense) counterparts of the sparse bag-of-n-grams in NBSVM. We find that n-grams and NB weighting are also effective in distributed representations. As a result, our models establish new strong baselines on 9 text classification datasets; e.g., on the IMDB dataset, we reach 93.5% accuracy, which exceeds previous state-of-the-art results obtained by deep neural models. All source code is publicly available at https://github.com/zhezhaoa/neural_BOW_toolkit.
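
The sketch below illustrates the two ingredients the abstract combines: the standard NBSVM-style log-count ratio for weighting n-gram features, and a weighted average of n-gram embeddings guided by those weights. It is a minimal illustration under assumed inputs, not the authors' implementation; the function and variable names (nb_log_count_ratio, weighted_ngram_average, X_pos, X_neg) are hypothetical.

    import numpy as np

    def nb_log_count_ratio(X_pos, X_neg, alpha=1.0):
        # X_pos, X_neg: (num_docs, num_ngrams) binary bag-of-n-grams matrices
        # for positive- and negative-class training documents; alpha smooths counts.
        p = alpha + X_pos.sum(axis=0)                 # smoothed n-gram counts in positive docs
        q = alpha + X_neg.sum(axis=0)                 # smoothed n-gram counts in negative docs
        return np.log((p / p.sum()) / (q / q.sum()))  # per-n-gram NB weight (log-count ratio)

    def weighted_ngram_average(ngram_ids, embeddings, r):
        # Build a document vector by averaging n-gram embeddings, scaling each
        # embedding by the magnitude of its NB weight so that class-discriminative
        # n-grams dominate the dense representation.
        w = np.abs(r[ngram_ids])
        return (embeddings[ngram_ids] * w[:, None]).sum(axis=0) / (w.sum() + 1e-8)

The resulting document vector could then be fed to a linear classifier or softmax layer; how exactly the NB weights enter the neural model in the paper may differ from this simple weighted average.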
Anthology ID:
C16-1150
Volume:
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Month:
December
Year:
2016
Address:
Osaka, Japan
Editors:
Yuji Matsumoto, Rashmi Prasad
Venue:
COLING
Publisher:
The COLING 2016 Organizing Committee
Pages:
1591–1600
URL:
https://aclanthology.org/C16-1150
Cite (ACL):
Bofang Li, Zhe Zhao, Tao Liu, Puwei Wang, and Xiaoyong Du. 2016. Weighted Neural Bag-of-n-grams Model: New Baselines for Text Classification. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 1591–1600, Osaka, Japan. The COLING 2016 Organizing Committee.
Cite (Informal):
Weighted Neural Bag-of-n-grams Model: New Baselines for Text Classification (Li et al., COLING 2016)
PDF:
https://aclanthology.org/C16-1150.pdf
Code
 zhezhaoa/neural_BOW_toolkit