Disentangling Task Relations for Few-shot Text Classification via Self-Supervised Hierarchical Task Clustering

Juan Zha, Zheng Li, Ying Wei, Yu Zhang


Abstract
Few-Shot Text Classification (FSTC) imitates the human ability to learn a new text classifier efficiently from only a few examples, by leveraging prior knowledge from historical tasks. However, most prior works assume that all tasks are sampled from a single data source, which does not match real-world scenarios where tasks are heterogeneous and follow different distributions. As a result, existing methods with globally shared knowledge mechanisms may struggle to handle such task heterogeneity. Moreover, inherent task relationships are not explicitly captured, leaving task knowledge unorganized and hard to transfer to new tasks. We therefore explore a new FSTC setting where tasks can come from a diverse range of data sources. To address the task heterogeneity, we propose a self-supervised hierarchical task clustering (SS-HTC) method. SS-HTC not only customizes cluster-specific knowledge by dynamically organizing heterogeneous tasks into clusters at hierarchical levels, but also disentangles the underlying relations between tasks to improve interpretability. Empirically, extensive experiments on five public FSTC benchmark datasets demonstrate the effectiveness of SS-HTC.
Anthology ID:
2022.findings-emnlp.383
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2022
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
5236–5247
URL:
https://aclanthology.org/2022.findings-emnlp.383
DOI:
10.18653/v1/2022.findings-emnlp.383
Cite (ACL):
Juan Zha, Zheng Li, Ying Wei, and Yu Zhang. 2022. Disentangling Task Relations for Few-shot Text Classification via Self-Supervised Hierarchical Task Clustering. In Findings of the Association for Computational Linguistics: EMNLP 2022, pages 5236–5247, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Disentangling Task Relations for Few-shot Text Classification via Self-Supervised Hierarchical Task Clustering (Zha et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-emnlp.383.pdf