Incubating Text Classifiers Following User Instruction with Nothing but LLM

Letian Peng, Zilong Wang, Jingbo Shang


Abstract
In this paper, we aim to generate text classification data given arbitrary class definitions (i.e., user instruction), so one can train a text classifier without any human annotation or raw corpus. Recent advances in large language models (LLMs) have led to pioneering attempts that individually generate texts for each class via prompting. We propose Incubator, the first framework that can handle complicated and even mutually dependent classes (e.g., "TED Talk given by Educator" and "Other"). Specifically, our Incubator is a fine-tuned LLM that takes the instructions of all class definitions as input, and in each inference pass it jointly generates one sample for every class. First, we tune Incubator on the instruction-to-data mappings that we obtained from classification datasets and descriptions on Hugging Face, together with in-context augmentation by GPT-4. To emphasize uniformity and diversity in its generations, we refine Incubator by fine-tuning on the cluster centers of semantic textual embeddings of the generated samples. We compare Incubator on various classification tasks with strong baselines such as direct LLM-based inference and training data generation by prompt engineering. Experiments show that Incubator is able to (1) outperform previous methods on traditional benchmarks, (2) take label interdependency and user preference into consideration, and (3) enable logical text mining by incubating multiple classifiers.
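To make the diversity-oriented refinement step more concrete, the following is a minimal, hypothetical sketch of how cluster-center samples could be selected from LLM-generated texts using an off-the-shelf sentence encoder and k-means. The encoder name, cluster count, and function names are assumptions for illustration only, not the paper's released implementation.

```python
# Illustrative sketch (not the authors' code): embed LLM-generated samples and
# keep the ones closest to k-means cluster centers, so the retained data is
# both diverse (spread across clusters) and representative (near centers).
from sentence_transformers import SentenceTransformer
from sklearn.cluster import KMeans
from sklearn.metrics import pairwise_distances_argmin


def select_center_samples(texts, n_clusters=8, encoder_name="all-MiniLM-L6-v2"):
    """Return one representative text per embedding cluster."""
    encoder = SentenceTransformer(encoder_name)  # assumed off-the-shelf encoder
    embeddings = encoder.encode(texts, normalize_embeddings=True)

    # Cluster the generated samples in embedding space.
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(embeddings)

    # For each cluster center, pick the generated sample closest to it.
    nearest = pairwise_distances_argmin(kmeans.cluster_centers_, embeddings)
    return [texts[i] for i in sorted(set(nearest))]


# Usage: `generated` would come from prompting the tuned Incubator model with the
# user's class definitions; the selected subset is then used for further
# fine-tuning and classifier training.
generated = ["a TED talk about teaching math ...", "a cooking blog post ..."]  # placeholder samples
representative = select_center_samples(generated, n_clusters=2)
print(representative)
```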
Anthology ID: 2024.emnlp-main.220
Volume: Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2024
Address: Miami, Florida, USA
Editors: Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 3753–3766
URL: https://aclanthology.org/2024.emnlp-main.220
Cite (ACL): Letian Peng, Zilong Wang, and Jingbo Shang. 2024. Incubating Text Classifiers Following User Instruction with Nothing but LLM. In Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing, pages 3753–3766, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal): Incubating Text Classifiers Following User Instruction with Nothing but LLM (Peng et al., EMNLP 2024)
PDF: https://aclanthology.org/2024.emnlp-main.220.pdf