Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning

Yutao Mou, Keqing He, Yanan Wu, Zhiyuan Zeng, Hong Xu, Huixing Jiang, Wei Wu, Weiran Xu


Abstract
Discovering Out-of-Domain (OOD) intents is essential for developing new skills in a task-oriented dialogue system. The key challenge is how to transfer prior in-domain (IND) knowledge to OOD clustering. Unlike existing work based on a shared intent representation, we propose a novel disentangled knowledge transfer method via a unified multi-head contrastive learning framework. We aim to bridge the gap between IND pre-training and OOD clustering. Experiments and analysis on two benchmark datasets show the effectiveness of our method.
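The abstract mentions contrastive learning as the basis for knowledge transfer. As a rough illustration only (this is a generic InfoNCE-style contrastive loss, not the paper's exact multi-head objective; all names and parameters here are hypothetical), a minimal NumPy sketch of a contrastive loss over paired views of utterance embeddings might look like:

```python
import numpy as np

def info_nce_loss(z1, z2, temperature=0.1):
    """Generic InfoNCE contrastive loss (illustrative, not the paper's method).

    z1, z2: (N, D) embedding matrices; row i of z1 and row i of z2 are
    two views of the same utterance, so the positives lie on the diagonal
    of the similarity matrix.
    """
    # L2-normalize so dot products are cosine similarities
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature              # (N, N) similarities
    logits -= logits.max(axis=1, keepdims=True)   # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # cross-entropy with the positive pair as the target class
    return -np.mean(np.diag(log_prob))

rng = np.random.default_rng(0)
anchor = rng.normal(size=(8, 16))
positive = anchor + 0.01 * rng.normal(size=(8, 16))  # near-identical view
loss = info_nce_loss(anchor, positive)
```

With well-aligned positive pairs the loss is close to zero; a framework like the one described would presumably apply such losses through separate projection heads for IND pre-training and OOD clustering.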
Anthology ID:
2022.acl-short.6
Volume:
Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers)
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
ACL
Publisher:
Association for Computational Linguistics
Pages:
46–53
URL:
https://aclanthology.org/2022.acl-short.6
DOI:
10.18653/v1/2022.acl-short.6
Cite (ACL):
Yutao Mou, Keqing He, Yanan Wu, Zhiyuan Zeng, Hong Xu, Huixing Jiang, Wei Wu, and Weiran Xu. 2022. Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning. In Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers), pages 46–53, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Disentangled Knowledge Transfer for OOD Intent Discovery with Unified Contrastive Learning (Mou et al., ACL 2022)
PDF:
https://aclanthology.org/2022.acl-short.6.pdf
Code:
myt517/dkt