Semi-supervised Meta-learning for Cross-domain Few-shot Intent Classification

Yue Li, Jiong Zhang


Abstract
Meta-learning aims to optimize a model's ability to generalize to new tasks and domains. The lack of a data-efficient way to create meta-training tasks has prevented the application of meta-learning to real-world few-shot learning scenarios. Recent studies have proposed unsupervised approaches that create meta-training tasks from unlabeled data for free; e.g., the SMLMT method (Bansal et al., 2020a) constructs unsupervised multi-class classification tasks from unlabeled text by randomly masking words in sentences and letting the meta-learner choose which word fills the blank. This study proposes a semi-supervised meta-learning approach that combines the representational power of large pre-trained language models with the generalization capability of prototypical networks enhanced by SMLMT. The semi-supervised meta-training approach avoids overfitting prototypical networks to a small number of labeled training examples and quickly learns cross-domain, task-specific representations from only a few supporting examples. By incorporating SMLMT into prototypical-network training, the meta-learner generalizes better to unseen domains and achieves higher accuracy on out-of-scope examples without heavy pre-training. We observe significant improvements in few-shot generalization after training for only a few epochs on intent classification tasks evaluated in a multi-domain setting.
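The SMLMT task construction described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the function name, parameters, and masking token are assumptions; the core idea (each class is a randomly chosen word, and its examples are sentences with that word masked out) follows the abstract's description.

```python
# Hypothetical sketch of SMLMT-style unsupervised task construction
# (after Bansal et al., 2020a); names and defaults are illustrative only.
import random

def build_smlmt_task(sentences, num_classes=3, examples_per_class=2, seed=0):
    """Build one N-way classification task from unlabeled text.

    Each class corresponds to a randomly chosen word; its examples are
    sentences containing that word, with the word replaced by [MASK].
    The meta-learner must decide which masked word each sentence held.
    """
    rng = random.Random(seed)
    # Index sentences by the distinct words they contain.
    by_word = {}
    for sent in sentences:
        for word in set(sent.split()):
            by_word.setdefault(word, []).append(sent)
    # Keep only words frequent enough to supply the required examples.
    candidates = [w for w, s in by_word.items() if len(s) >= examples_per_class]
    labels = rng.sample(candidates, num_classes)
    task = []
    for class_id, word in enumerate(labels):
        for sent in rng.sample(by_word[word], examples_per_class):
            masked = " ".join("[MASK]" if w == word else w for w in sent.split())
            task.append((masked, class_id))
    return labels, task
```

Sampling many such tasks yields an unlimited stream of meta-training episodes from raw text, which is what lets the semi-supervised setup avoid overfitting to the small labeled set.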
Anthology ID:
2021.metanlp-1.8
Volume:
Proceedings of the 1st Workshop on Meta Learning and Its Applications to Natural Language Processing
Month:
August
Year:
2021
Address:
Online
Editors:
Hung-Yi Lee, Mitra Mohtarami, Shang-Wen Li, Di Jin, Mandy Korpusik, Shuyan Dong, Ngoc Thang Vu, Dilek Hakkani-Tur
Venue:
MetaNLP
Publisher:
Association for Computational Linguistics
Pages:
67–75
URL:
https://aclanthology.org/2021.metanlp-1.8
DOI:
10.18653/v1/2021.metanlp-1.8
Cite (ACL):
Yue Li and Jiong Zhang. 2021. Semi-supervised Meta-learning for Cross-domain Few-shot Intent Classification. In Proceedings of the 1st Workshop on Meta Learning and Its Applications to Natural Language Processing, pages 67–75, Online. Association for Computational Linguistics.
Cite (Informal):
Semi-supervised Meta-learning for Cross-domain Few-shot Intent Classification (Li & Zhang, MetaNLP 2021)
PDF:
https://aclanthology.org/2021.metanlp-1.8.pdf
Video:
https://aclanthology.org/2021.metanlp-1.8.mp4
Data
CLINC150