Improve Meta-learning for Few-Shot Text Classification with All You Can Acquire from the Tasks

Xinyue Liu, Yunlong Gao, Linlin Zong, Bo Xu


Abstract
Meta-learning has emerged as a prominent technique for few-shot text classification and has achieved promising performance. However, existing methods often have difficulty drawing accurate class prototypes from support set samples, primarily because of potentially large intra-class differences and small inter-class differences within a task. Recent approaches incorporate external knowledge or pre-trained language models to augment data, but this requires additional resources and thus does not suit many few-shot scenarios. In this paper, we propose a novel solution that fully exploits the information within the task itself. Specifically, we use label information to construct a task-adaptive metric space, adaptively reducing intra-class differences and magnifying inter-class differences. We further employ optimal transport to estimate class prototypes jointly with query set samples, mitigating the problem of inaccurate and ambiguous support set samples caused by large intra-class differences. We conduct extensive experiments on eight benchmark datasets, and our approach shows clear advantages over state-of-the-art models across all tasks on all datasets. For reproducibility, all datasets and code are available at https://github.com/YvoGao/LAQDA.
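The prototype-estimation step in the abstract can be illustrated with a short sketch. The snippet below is not the authors' LAQDA implementation; it is a minimal NumPy illustration, under the assumption that entropy-regularized (Sinkhorn) optimal transport with uniform marginals softly assigns query embeddings to classes, and that refined prototypes interpolate between support-set prototypes and OT-weighted query means. The function names (sinkhorn, refine_prototypes) and hyperparameters (eps, the number of iterations, the 0.5 interpolation weight) are all illustrative assumptions, and the task-adaptive metric-space construction is omitted.

    # Minimal sketch (assumptions noted above), not the authors' implementation.
    import numpy as np

    def sinkhorn(cost, eps=0.1, n_iters=50):
        """Entropy-regularized OT: transport plan for a (queries x classes)
        cost matrix with uniform marginals on both sides."""
        n, m = cost.shape
        K = np.exp(-cost / eps)                # Gibbs kernel
        r = np.ones(n) / n                     # uniform marginal over queries
        c = np.ones(m) / m                     # uniform marginal over classes
        u, v = np.ones(n) / n, np.ones(m) / m
        for _ in range(n_iters):
            u = r / (K @ v)
            v = c / (K.T @ u)
        return np.diag(u) @ K @ np.diag(v)     # rows ~ r, columns ~ c

    def refine_prototypes(support_protos, queries, eps=0.1):
        """Pull each class prototype toward the query points that OT
        softly assigns to that class."""
        # Cost: squared Euclidean distance between queries and prototypes.
        diff = queries[:, None, :] - support_protos[None, :, :]
        cost = (diff ** 2).sum(-1)             # shape (n_queries, n_classes)
        plan = sinkhorn(cost, eps)
        weights = plan / plan.sum(axis=0, keepdims=True)  # normalize per class
        query_protos = weights.T @ queries     # OT-weighted query means
        return 0.5 * support_protos + 0.5 * query_protos  # simple interpolation

    # Toy usage: a 3-way task with 5 query points in a 4-dim embedding space.
    rng = np.random.default_rng(0)
    protos = rng.normal(size=(3, 4))
    queries = rng.normal(size=(5, 4))
    print(refine_prototypes(protos, queries).shape)  # (3, 4)

The intuition is that when support samples are scarce or ambiguous, the (unlabeled) query set still carries information about where the class clusters lie; the transport plan distributes that information across classes without requiring any external data.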
Anthology ID:
2024.findings-emnlp.12
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2024
Month:
November
Year:
2024
Address:
Miami, Florida, USA
Editors:
Yaser Al-Onaizan, Mohit Bansal, Yun-Nung Chen
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
223–235
URL:
https://aclanthology.org/2024.findings-emnlp.12
Cite (ACL):
Xinyue Liu, Yunlong Gao, Linlin Zong, and Bo Xu. 2024. Improve Meta-learning for Few-Shot Text Classification with All You Can Acquire from the Tasks. In Findings of the Association for Computational Linguistics: EMNLP 2024, pages 223–235, Miami, Florida, USA. Association for Computational Linguistics.
Cite (Informal):
Improve Meta-learning for Few-Shot Text Classification with All You Can Acquire from the Tasks (Liu et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-emnlp.12.pdf
Software:
2024.findings-emnlp.12.software.zip
Data:
2024.findings-emnlp.12.data.zip