Complicate Then Simplify: A Novel Way to Explore Pre-trained Models for Text Classification

Xu Zhang, Zejie Liu, Yanzheng Xiang, Deyu Zhou


Abstract
With the development of pre-trained models (PTMs), the performance of text classification has been continuously improved by directly employing the features generated by PTMs. However, such an approach might not fully exploit the knowledge in PTMs because it is constrained by the difficulty of the task: compared to difficult tasks, learning algorithms tend to saturate early on simple ones. Moreover, the native sentence representations derived from BERT are prone to collapse, and directly employing such representations for text classification might fail to capture discriminative features. To address these issues, in this paper we propose a novel framework for text classification that implements a two-stage training strategy. In the pre-training stage, auxiliary labels are introduced to increase the task difficulty and to fully exploit the knowledge in the pre-trained model. In the fine-tuning stage, the textual representation learned in the pre-training stage is employed and the classifier is fine-tuned to obtain better classification performance. Experiments were conducted on six text classification corpora, and the results showed that the proposed framework outperformed several state-of-the-art baselines.
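One way to read the "complicate then simplify" idea is that auxiliary labels enlarge the label space in the first stage, making the classification task harder, and the second stage projects back to the original labels. The following is a minimal illustrative sketch of that label-space construction only; the function names and the pairing scheme are assumptions, not the authors' implementation.

```python
# Hypothetical sketch of the two-stage label handling, NOT the paper's code.
# Stage 1 ("complicate"): train on an enlarged label space built from the
# cross product of original labels and auxiliary labels.
# Stage 2 ("simplify"): fine-tune a classifier on the original label space,
# recovering original labels from stage-1 class ids.

def combine_labels(label: int, aux_label: int, n_aux: int) -> int:
    """Map (original label, auxiliary label) to one class id in the enlarged
    space, so the stage-1 task has label_count * n_aux classes."""
    return label * n_aux + aux_label

def recover_label(combined: int, n_aux: int) -> int:
    """Project a stage-1 class id back to the original label space."""
    return combined // n_aux
```

With 3 auxiliary labels, original label 2 paired with auxiliary label 1 maps to class id 7, and integer division recovers the original label from it.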
Anthology ID:
2022.coling-1.97
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
1136–1145
URL:
https://aclanthology.org/2022.coling-1.97
Cite (ACL):
Xu Zhang, Zejie Liu, Yanzheng Xiang, and Deyu Zhou. 2022. Complicate Then Simplify: A Novel Way to Explore Pre-trained Models for Text Classification. In Proceedings of the 29th International Conference on Computational Linguistics, pages 1136–1145, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Complicate Then Simplify: A Novel Way to Explore Pre-trained Models for Text Classification (Zhang et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.97.pdf
Data
CLUE