%0 Conference Proceedings
%T PROTAUGMENT: Unsupervised diverse short-texts paraphrasing for intent detection meta-learning
%A Dopierre, Thomas
%A Gravier, Christophe
%A Logerais, Wilfried
%Y Zong, Chengqing
%Y Xia, Fei
%Y Li, Wenjie
%Y Navigli, Roberto
%S Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
%D 2021
%8 August
%I Association for Computational Linguistics
%C Online
%F dopierre-etal-2021-protaugment
%X Recent research considers few-shot intent detection as a meta-learning problem: the model learns to learn from a consecutive set of small tasks named episodes. In this work, we propose ProtAugment, a meta-learning algorithm for short-text classification (the intent detection task). ProtAugment is a novel extension of Prototypical Networks that limits overfitting on the bias introduced by the few-shot classification objective at each episode. It relies on diverse paraphrasing: a conditional language model is first fine-tuned for paraphrasing, and diversity is later introduced at the decoding stage at each meta-learning episode. The diverse paraphrasing is unsupervised, as it is applied to unlabelled data, and is then fed into the Prototypical Network training objective as a consistency loss. ProtAugment is the state-of-the-art method for intent detection meta-learning, at no extra labeling effort and without the need to fine-tune a conditional language model on a given application domain.
%R 10.18653/v1/2021.acl-long.191
%U https://aclanthology.org/2021.acl-long.191
%U https://doi.org/10.18653/v1/2021.acl-long.191
%P 2454-2466