Meta-LMTC: Meta-Learning for Large-Scale Multi-Label Text Classification

Ran Wang, Xi’ao Su, Siyu Long, Xinyu Dai, Shujian Huang, Jiajun Chen


Abstract
Large-scale multi-label text classification (LMTC) tasks often face long-tailed label distributions, where many labels have few or even no training instances. Although current methods can exploit prior knowledge to handle these few/zero-shot labels, they neglect the meta-knowledge contained in the dataset that can guide models to learn with few samples. In this paper, for the first time, this problem is addressed from a meta-learning perspective. However, naively extending meta-learning approaches to multi-label classification is sub-optimal for LMTC tasks because of the long-tailed label distribution and the coexistence of few- and zero-shot scenarios. We propose a meta-learning approach named META-LMTC. Specifically, it constructs more faithful and more diverse tasks according to well-designed sampling strategies and directly incorporates the objective of adapting to new low-resource tasks into the meta-learning phase. Extensive experiments show that META-LMTC achieves state-of-the-art performance against strong baselines and can still enhance powerful BERT-like models.
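The abstract only gestures at the method, so the following is a minimal, illustrative sketch of the general idea it describes: episodes whose label sampling is biased toward the long tail, plus a MAML-style inner/outer update with a multi-label (binary cross-entropy) objective. This is not the authors' META-LMTC implementation; the toy encoder, the sampling scheme, and every name (TextEncoder, sample_episode, etc.) are assumptions made for illustration only.

```python
# Hypothetical sketch of frequency-biased episode sampling + one MAML-style
# update for multi-label text classification. Not the paper's code.
import random
import torch
import torch.nn.functional as F

torch.manual_seed(0)
VOCAB, EMB, NUM_LABELS = 1000, 64, 200


class TextEncoder(torch.nn.Module):
    """Toy encoder: mean-pooled word embeddings + linear label scorer."""

    def __init__(self):
        super().__init__()
        self.emb = torch.nn.Embedding(VOCAB, EMB)
        self.out = torch.nn.Linear(EMB, NUM_LABELS)

    def forward(self, tokens, fast_weights=None):
        h = self.emb(tokens).mean(dim=1)                  # (batch, EMB)
        if fast_weights is None:                          # shared initialization
            return self.out(h)
        return F.linear(h, fast_weights["w"], fast_weights["b"])  # adapted weights


def sample_episode(label_freq, n_support=4, n_query=4, seq_len=20):
    """Build a synthetic episode; rarer labels are sampled more often
    (a stand-in for a sampler biased toward tail labels)."""
    probs = 1.0 / (label_freq + 1.0)
    probs = probs / probs.sum()
    n = n_support + n_query
    tokens = torch.randint(0, VOCAB, (n, seq_len))
    labels = torch.zeros(n, NUM_LABELS)
    for i in range(n):
        picked = torch.multinomial(probs, num_samples=3, replacement=False)
        labels[i, picked] = 1.0
    return tokens[:n_support], labels[:n_support], tokens[n_support:], labels[n_support:]


model = TextEncoder()
meta_opt = torch.optim.Adam(model.parameters(), lr=1e-3)
label_freq = torch.tensor([random.expovariate(0.05) for _ in range(NUM_LABELS)])
inner_lr = 0.1

for step in range(5):
    s_x, s_y, q_x, q_y = sample_episode(label_freq)

    # Inner loop: one gradient step on the support set produces "fast" weights.
    support_loss = F.binary_cross_entropy_with_logits(model(s_x), s_y)
    grads = torch.autograd.grad(
        support_loss, [model.out.weight, model.out.bias], create_graph=True
    )
    fast = {"w": model.out.weight - inner_lr * grads[0],
            "b": model.out.bias - inner_lr * grads[1]}

    # Outer loop: the query loss of the adapted weights updates the shared
    # initialization, i.e. the objective of adapting well to new low-resource tasks.
    query_loss = F.binary_cross_entropy_with_logits(model(q_x, fast), q_y)
    meta_opt.zero_grad()
    query_loss.backward()
    meta_opt.step()
    print(f"step {step}: query loss {query_loss.item():.4f}")
```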
Anthology ID: 2021.emnlp-main.679
Volume: Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
Month: November
Year: 2021
Address: Online and Punta Cana, Dominican Republic
Editors: Marie-Francine Moens, Xuanjing Huang, Lucia Specia, Scott Wen-tau Yih
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 8633–8646
URL: https://aclanthology.org/2021.emnlp-main.679
DOI: 10.18653/v1/2021.emnlp-main.679
Cite (ACL): Ran Wang, Xi’ao Su, Siyu Long, Xinyu Dai, Shujian Huang, and Jiajun Chen. 2021. Meta-LMTC: Meta-Learning for Large-Scale Multi-Label Text Classification. In Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing, pages 8633–8646, Online and Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal): Meta-LMTC: Meta-Learning for Large-Scale Multi-Label Text Classification (Wang et al., EMNLP 2021)
PDF: https://aclanthology.org/2021.emnlp-main.679.pdf
Video: https://aclanthology.org/2021.emnlp-main.679.mp4
Data: EURLEX57K