PROTAUGMENT: Unsupervised diverse short-texts paraphrasing for intent detection meta-learning

Thomas Dopierre, Christophe Gravier, Wilfried Logerais


Abstract
Recent research considers few-shot intent detection as a meta-learning problem: the model is learning to learn from a consecutive set of small tasks named episodes. In this work, we propose ProtAugment, a meta-learning algorithm for short-text classification (the intent detection task). ProtAugment is a novel extension of Prototypical Networks that limits overfitting on the bias introduced by the few-shot classification objective at each episode. It relies on diverse paraphrasing: a conditional language model is first fine-tuned for paraphrasing, and diversity is later introduced at the decoding stage at each meta-learning episode. The diverse paraphrasing is unsupervised, as it is applied to unlabelled data, and it is then fed into the Prototypical Network training objective as a consistency loss. ProtAugment is the state-of-the-art method for intent detection meta-learning, at no extra labeling effort and without the need to fine-tune a conditional language model on a given application domain.
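The sketch below illustrates the episode-level objective the abstract describes: a standard Prototypical Networks loss on the labeled query set, plus an unsupervised consistency term that pushes diverse paraphrases of unlabeled texts toward the same prototype assignment as their originals. It is a minimal sketch in PyTorch; the tensor layouts, the euclidean_dist helper, and the cross-entropy form of the consistency term are illustrative assumptions, not the authors' exact implementation (see tdopierre/ProtAugment for that).

# Minimal sketch of a ProtAugment-style episode loss (assumed shapes and
# helpers; not the authors' code).
import torch
import torch.nn.functional as F

def euclidean_dist(x, y):
    # x: (n, d), y: (m, d) -> (n, m) pairwise squared Euclidean distances
    return ((x.unsqueeze(1) - y.unsqueeze(0)) ** 2).sum(-1)

def episode_loss(support, query, unlabeled, paraphrases, n_way, k_shot):
    """support:     (n_way * k_shot, d) embeddings of labeled support texts
    query:       (n_way * q, d)      embeddings of labeled query texts, grouped by class
    unlabeled:   (u, d)              embeddings of unlabeled texts
    paraphrases: (u, p, d)           embeddings of p diverse paraphrases per unlabeled text
    """
    d = support.size(-1)
    # Class prototypes: mean of each class's support embeddings.
    prototypes = support.view(n_way, k_shot, d).mean(dim=1)          # (n_way, d)

    # Supervised Prototypical Networks loss on the query set.
    logits_q = -euclidean_dist(query, prototypes)                    # (n_way*q, n_way)
    targets = torch.arange(n_way).repeat_interleave(query.size(0) // n_way)
    proto_loss = F.cross_entropy(logits_q, targets)

    # Unsupervised consistency loss: each paraphrase should follow the same
    # prototype distribution as its original unlabeled sentence.
    with torch.no_grad():
        p_orig = F.softmax(-euclidean_dist(unlabeled, prototypes), dim=-1)   # (u, n_way)
    u, p, _ = paraphrases.shape
    logits_para = -euclidean_dist(paraphrases.reshape(u * p, d), prototypes)
    log_p_para = F.log_softmax(logits_para, dim=-1).view(u, p, n_way)
    consistency = -(p_orig.unsqueeze(1) * log_p_para).sum(-1).mean()

    return proto_loss + consistency

In this reading, the paraphraser only supplies augmented views of unlabeled data; no extra labels are needed, which matches the abstract's claim of "no extra labeling effort".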
Anthology ID:
2021.acl-long.191
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
2454–2466
URL:
https://aclanthology.org/2021.acl-long.191
DOI:
10.18653/v1/2021.acl-long.191
Cite (ACL):
Thomas Dopierre, Christophe Gravier, and Wilfried Logerais. 2021. PROTAUGMENT: Unsupervised diverse short-texts paraphrasing for intent detection meta-learning. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 2454–2466, Online. Association for Computational Linguistics.
Cite (Informal):
PROTAUGMENT: Unsupervised diverse short-texts paraphrasing for intent detection meta-learning (Dopierre et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.191.pdf
Optional supplementary material:
 2021.acl-long.191.OptionalSupplementaryMaterial.zip
Video:
 https://aclanthology.org/2021.acl-long.191.mp4
Code:
tdopierre/ProtAugment
Data:
BANKING77 | DialoGLUE | HWU64