Cluster & Tune: Boost Cold Start Performance in Text Classification

Eyal Shnarch, Ariel Gera, Alon Halfon, Lena Dankin, Leshem Choshen, Ranit Aharonov, Noam Slonim


Abstract
In real-world scenarios, a text classification task often begins with a cold start, when labeled data is scarce. In such cases, the common practice of fine-tuning pre-trained models, such as BERT, for a target classification task is prone to producing poor performance. We suggest a method to boost the performance of such models by adding an intermediate unsupervised classification task between the pre-training and fine-tuning phases. As this intermediate task, we perform clustering and train the pre-trained model on predicting the cluster labels. We evaluate this approach on various datasets and show that this additional classification phase can significantly improve performance, mainly for topical classification tasks, when the number of labeled instances available for fine-tuning is only a couple of dozen to a few hundred.
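To make the recipe concrete, the sketch below illustrates the two-stage idea described in the abstract: cluster an unlabeled corpus, train the pre-trained model to predict the cluster ids as an intermediate classification task, and only then fine-tune on the small labeled target set. This is a minimal illustration, not the authors' implementation (their code is released at ibm/intermediate-training-using-clustering, linked below); the choice of KMeans over TF-IDF features, the number of clusters, and all hyperparameters here are assumptions made for the sketch.

# Minimal sketch of the Cluster & Tune idea. KMeans over TF-IDF vectors stands in
# for the clustering step, and Hugging Face transformers is used for both the
# intermediate and the final fine-tuning; all settings below are illustrative.
from sklearn.cluster import KMeans
from sklearn.feature_extraction.text import TfidfVectorizer
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)
import torch


def make_dataset(texts, labels, tokenizer):
    # Wrap tokenized texts and labels in a torch Dataset for the Trainer.
    enc = tokenizer(texts, truncation=True, padding=True)

    class DS(torch.utils.data.Dataset):
        def __len__(self):
            return len(labels)

        def __getitem__(self, i):
            item = {k: torch.tensor(v[i]) for k, v in enc.items()}
            item["labels"] = torch.tensor(labels[i])
            return item

    return DS()


def cluster_and_tune(unlabeled_texts, labeled_texts, labels,
                     model_name="bert-base-uncased", n_clusters=50):
    tokenizer = AutoTokenizer.from_pretrained(model_name)

    # 1) Unsupervised intermediate task: cluster the unlabeled corpus and use
    #    the cluster ids as pseudo-labels for classification training.
    tfidf = TfidfVectorizer(max_features=10000).fit_transform(unlabeled_texts)
    pseudo_labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(tfidf)

    model = AutoModelForSequenceClassification.from_pretrained(
        model_name, num_labels=n_clusters)
    inter_args = TrainingArguments("intermediate_run", num_train_epochs=1,
                                   per_device_train_batch_size=16)
    Trainer(model=model, args=inter_args,
            train_dataset=make_dataset(unlabeled_texts,
                                       pseudo_labels.tolist(),
                                       tokenizer)).train()

    # 2) Supervised step: discard the intermediate classification head and
    #    fine-tune the adapted encoder on the small labeled target set.
    model.save_pretrained("intermediate_model")
    target_model = AutoModelForSequenceClassification.from_pretrained(
        "intermediate_model", num_labels=len(set(labels)),
        ignore_mismatched_sizes=True)
    final_args = TrainingArguments("final_run", num_train_epochs=10,
                                   per_device_train_batch_size=16)
    Trainer(model=target_model, args=final_args,
            train_dataset=make_dataset(labeled_texts, labels,
                                       tokenizer)).train()
    return target_model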
Anthology ID:
2022.acl-long.526
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
7639–7653
URL:
https://aclanthology.org/2022.acl-long.526
DOI:
10.18653/v1/2022.acl-long.526
Cite (ACL):
Eyal Shnarch, Ariel Gera, Alon Halfon, Lena Dankin, Leshem Choshen, Ranit Aharonov, and Noam Slonim. 2022. Cluster & Tune: Boost Cold Start Performance in Text Classification. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 7639–7653, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Cluster & Tune: Boost Cold Start Performance in Text Classification (Shnarch et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.526.pdf
Video:
https://aclanthology.org/2022.acl-long.526.mp4
Code:
ibm/intermediate-training-using-clustering
Data:
Yahoo! Answers