Self-training Strategies for Sentiment Analysis: An Empirical Study

Haochen Liu, Sai Rallabandi, Yijing Wu, Parag Dakle, Preethi Raghavan


Abstract
Sentiment analysis is a crucial task in natural language processing that involves identifying and extracting subjective sentiment from text. Self-training has recently emerged as an economical and efficient technique for developing sentiment analysis models by leveraging a small amount of labeled data and a large amount of unlabeled data. However, given a set of training data, how that data is used for self-training makes a significant difference in the final performance of the model. We refer to this methodology as the self-training strategy. In this paper, we present an empirical study of various self-training strategies for sentiment analysis. First, we investigate the influence of the self-training strategy and hyper-parameters on the performance of traditional small language models (SLMs) in various few-shot settings. Second, we explore the feasibility of leveraging large language models (LLMs) to assist self-training. We propose and empirically compare several self-training strategies with the intervention of LLMs. Extensive experiments are conducted on three real-world sentiment analysis datasets.
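
To make the setting concrete, a minimal sketch of a generic confidence-thresholded pseudo-labeling loop for sentiment classification is shown below. This is not the authors' specific strategies or models: the toy texts, the scikit-learn TF-IDF + logistic-regression classifier, and the THRESHOLD and N_ROUNDS values are illustrative assumptions standing in for the paper's SLMs, datasets, and strategy hyper-parameters.

import numpy as np
from scipy.sparse import vstack
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Toy stand-ins for a few-shot labeled set and a larger unlabeled pool.
labeled_texts = ["great movie, loved it", "terrible movie, hated it"]
labels = np.array([1, 0])  # 1 = positive, 0 = negative
unlabeled_texts = ["loved this great film", "hated this terrible film", "not bad at all"]

vectorizer = TfidfVectorizer().fit(labeled_texts + unlabeled_texts)
X_l = vectorizer.transform(labeled_texts)
X_u = vectorizer.transform(unlabeled_texts)

THRESHOLD = 0.6  # pseudo-label confidence threshold: one strategy hyper-parameter (assumed value)
N_ROUNDS = 3     # number of self-training rounds: another strategy hyper-parameter (assumed value)

for _ in range(N_ROUNDS):
    # Retrain on the current labeled set (seed labels plus accepted pseudo-labels).
    clf = LogisticRegression(max_iter=1000).fit(X_l, labels)
    if X_u.shape[0] == 0:
        break
    probs = clf.predict_proba(X_u)
    confident = probs.max(axis=1) >= THRESHOLD
    if not confident.any():
        break
    # Move confidently pseudo-labeled examples from the unlabeled pool into the labeled set.
    pseudo = clf.classes_[probs.argmax(axis=1)][confident]
    X_l = vstack([X_l, X_u[confident]])
    labels = np.concatenate([labels, pseudo])
    X_u = X_u[~confident]

print(clf.predict(vectorizer.transform(["such a wonderful story"])))

The strategy questions studied in the paper, such as how many rounds to run, how aggressively to accept pseudo-labels, and whether an LLM intervenes in labeling, correspond to choices left fixed in this sketch.
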
Anthology ID: 2024.findings-eacl.131
Volume: Findings of the Association for Computational Linguistics: EACL 2024
Month: March
Year: 2024
Address: St. Julian’s, Malta
Editors: Yvette Graham, Matthew Purver
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 1944–1954
URL: https://aclanthology.org/2024.findings-eacl.131
Cite (ACL): Haochen Liu, Sai Rallabandi, Yijing Wu, Parag Dakle, and Preethi Raghavan. 2024. Self-training Strategies for Sentiment Analysis: An Empirical Study. In Findings of the Association for Computational Linguistics: EACL 2024, pages 1944–1954, St. Julian’s, Malta. Association for Computational Linguistics.
Cite (Informal): Self-training Strategies for Sentiment Analysis: An Empirical Study (Liu et al., Findings 2024)
PDF: https://aclanthology.org/2024.findings-eacl.131.pdf
Video: https://aclanthology.org/2024.findings-eacl.131.mp4