Retrieval-style In-context Learning for Few-shot Hierarchical Text Classification

Huiyao Chen, Yu Zhao, Zulong Chen, Mengjia Wang, Liangyue Li, Meishan Zhang, Min Zhang


Abstract
Hierarchical text classification (HTC) is an important task with broad applications, and few-shot HTC has gained increasing interest recently. While in-context learning (ICL) with large language models (LLMs) has achieved significant success in few-shot learning, it is not as effective for HTC because of the expansive hierarchical label sets and extremely ambiguous labels. In this work, we introduce the first ICL-based framework with LLMs for few-shot HTC. We exploit a retrieval database to identify relevant demonstrations, and an iterative policy to manage multi-layer hierarchical labels. In particular, we equip the retrieval database with HTC label-aware representations of the input texts, achieved by continually training a pretrained language model with masked language modeling (MLM), layer-wise classification (CLS, specifically for HTC), and a novel divergent contrastive learning (DCL, mainly for adjacent semantically similar labels) objective. Experimental results on three benchmark datasets demonstrate the superior performance of our method, which achieves state-of-the-art results in few-shot HTC.
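The retrieve-then-prompt pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: the bag-of-words `embed` function is a hypothetical stand-in for the label-aware encoder (which the authors obtain by continually training a PLM with MLM, layer-wise CLS, and DCL objectives), and the function names, toy database, and prompt format are all invented for this sketch.

```python
from collections import Counter
import math

def embed(text):
    """Bag-of-words stand-in for the paper's label-aware encoder
    (hypothetical; the actual method uses a continually trained PLM)."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    num = sum(a[t] * b[t] for t in set(a) & set(b))
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def retrieve_demonstrations(query, database, k=2):
    """Return the k labeled examples most similar to the query text."""
    q = embed(query)
    return sorted(database, key=lambda ex: -cosine(embed(ex["text"]), q))[:k]

def build_prompt(query, demos, parent_label=None):
    """Assemble an ICL prompt for one hierarchy layer; an iterative policy
    would call this once per layer, conditioning on the predicted parent."""
    parts = [f"Text: {d['text']}\nLabel: {d['label']}" for d in demos]
    constraint = f" (child of '{parent_label}')" if parent_label else ""
    parts.append(f"Text: {query}\nLabel{constraint}:")
    return "\n\n".join(parts)

# Toy retrieval database of labeled texts (top-level labels only).
database = [
    {"text": "stock markets rally on strong earnings", "label": "Business"},
    {"text": "new vaccine trial shows early promise", "label": "Health"},
    {"text": "home team wins the championship final", "label": "Sports"},
]

demos = retrieve_demonstrations("quarterly earnings beat forecasts", database)
print(build_prompt("quarterly earnings beat forecasts", demos))
```

In the full method, the LLM's answer at each layer constrains the candidate labels at the next layer, so the prompt for a deeper layer is built with `parent_label` set to the previous prediction.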
Anthology ID:
2024.tacl-1.67
Volume:
Transactions of the Association for Computational Linguistics, Volume 12
Month:
Year:
2024
Address:
Cambridge, MA
Venue:
TACL
Publisher:
MIT Press
Pages:
1214–1231
URL:
https://aclanthology.org/2024.tacl-1.67
DOI:
10.1162/tacl_a_00697
Cite (ACL):
Huiyao Chen, Yu Zhao, Zulong Chen, Mengjia Wang, Liangyue Li, Meishan Zhang, and Min Zhang. 2024. Retrieval-style In-context Learning for Few-shot Hierarchical Text Classification. Transactions of the Association for Computational Linguistics, 12:1214–1231.
Cite (Informal):
Retrieval-style In-context Learning for Few-shot Hierarchical Text Classification (Chen et al., TACL 2024)
PDF:
https://aclanthology.org/2024.tacl-1.67.pdf