Improving Meta-learning for Low-resource Text Classification and Generation via Memory Imitation

Yingxiu Zhao, Zhiliang Tian, Huaxiu Yao, Yinhe Zheng, Dongkyu Lee, Yiping Song, Jian Sun, Nevin Zhang


Abstract
Building natural language processing (NLP) models is challenging in low-resource scenarios where only limited data are available. Optimization-based meta-learning algorithms achieve promising results in such scenarios by adapting a well-generalized model initialization to new tasks. Nonetheless, these approaches suffer from the memorization overfitting issue, where the model tends to memorize the meta-training tasks while ignoring support sets when adapting to new tasks. To address this issue, we propose a memory imitation meta-learning (MemIML) method that strengthens the model’s reliance on support sets for task adaptation. Specifically, we introduce a task-specific memory module to store support set information and construct an imitation module that forces query sets to imitate the behaviors of support sets stored in the memory. A theoretical analysis is provided to prove the effectiveness of our method, and empirical results also demonstrate that our method outperforms competitive baselines on both text classification and generation tasks.
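The abstract's two components can be illustrated with a minimal sketch: a task-specific memory built from support-set representations, and an imitation signal that pulls query representations toward what the memory returns. This is not the paper's implementation; the per-class mean-pooled memory slots, the softmax attention read, and the mean-squared-error imitation loss are all assumed choices for illustration.

```python
import numpy as np

def build_memory(support_feats, support_labels, n_classes):
    """Task-specific memory: one slot per class, storing the mean
    support-set representation for that class (an illustrative choice)."""
    memory = np.zeros((n_classes, support_feats.shape[1]))
    for c in range(n_classes):
        memory[c] = support_feats[support_labels == c].mean(axis=0)
    return memory

def read_memory(memory, query_feats):
    """Soft attention read: each query attends over the memory slots
    and receives a convex combination of support-derived slots."""
    scores = query_feats @ memory.T                        # (Q, C)
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)          # row-wise softmax
    return weights @ memory                                # (Q, D)

def imitation_loss(query_feats, memory):
    """Encourage query representations to imitate the support-derived
    memory read-out; MSE is an assumed stand-in for the paper's loss."""
    read = read_memory(memory, query_feats)
    return float(((query_feats - read) ** 2).mean())
```

Minimizing such a loss during adaptation ties the query-set predictions to information that can only come from the support set, which is the intuition behind counteracting memorization overfitting.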
Anthology ID:
2022.acl-long.44
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
583–595
URL:
https://aclanthology.org/2022.acl-long.44
DOI:
10.18653/v1/2022.acl-long.44
Cite (ACL):
Yingxiu Zhao, Zhiliang Tian, Huaxiu Yao, Yinhe Zheng, Dongkyu Lee, Yiping Song, Jian Sun, and Nevin Zhang. 2022. Improving Meta-learning for Low-resource Text Classification and Generation via Memory Imitation. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers), pages 583–595, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Improving Meta-learning for Low-resource Text Classification and Generation via Memory Imitation (Zhao et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-long.44.pdf
Software:
 2022.acl-long.44.software.zip