Dual Prompt Tuning based Contrastive Learning for Hierarchical Text Classification

Sishi Xiong, Yu Zhao, Jie Zhang, Li Mengxiang, Zhongjiang He, Xuelong Li, Shuangyong Song


Abstract
Hierarchical text classification aims to categorize texts into a multi-tiered, tree-structured label hierarchy. Existing methods focus on capturing hierarchy-aware text features by exploiting explicit parent-child relationships, while interactions between peer labels are rarely taken into account, resulting in severe label confusion within each layer. In this work, we propose a novel Dual Prompt Tuning (DPT) method, which emphasizes discriminating among peer labels by performing contrastive learning on each hierarchical layer. We design an innovative hand-crafted prompt containing slots for both positive and negative label predictions to cooperate with contrastive learning. In addition, we introduce a label hierarchy self-sensing auxiliary task to ensure cross-layer label consistency. Extensive experiments demonstrate that DPT achieves significant improvements and outperforms current state-of-the-art methods on the BGC and RCV1-V2 benchmark datasets.
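The abstract describes contrastive learning applied independently at each layer of the label hierarchy to separate peer labels. As an illustrative aid only, and not the authors' implementation, here is a minimal PyTorch sketch of a supervised-contrastive loss at a single hierarchy layer; all names (per_layer_supcon, features, layer_labels, tau) are assumptions introduced for the example.

```python
# Illustrative sketch only -- not the paper's code. A SupCon-style loss
# at one hierarchy layer: texts sharing a label at this layer are pulled
# together; texts with different peer labels are pushed apart. Summing
# this loss over all layers mirrors the per-layer contrastive idea
# described in the abstract.
import torch
import torch.nn.functional as F

def per_layer_supcon(features: torch.Tensor,
                     layer_labels: torch.Tensor,
                     tau: float = 0.1) -> torch.Tensor:
    """features: (batch, dim) text embeddings from the encoder.
    layer_labels: (batch,) label ids at one layer of the hierarchy."""
    z = F.normalize(features, dim=1)
    sim = z @ z.t() / tau                                   # pairwise similarity
    n = z.size(0)
    self_mask = torch.eye(n, dtype=torch.bool, device=z.device)
    pos_mask = layer_labels.unsqueeze(0).eq(layer_labels.unsqueeze(1)) & ~self_mask
    sim = sim.masked_fill(self_mask, float('-inf'))         # exclude self-pairs
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)     # softmax over the batch
    pos_log_prob = log_prob.masked_fill(~pos_mask, 0.0)     # keep positive pairs only
    pos_counts = pos_mask.sum(dim=1)
    has_pos = pos_counts > 0                                # anchors with >=1 positive
    loss = -pos_log_prob.sum(dim=1)[has_pos] / pos_counts[has_pos]
    return loss.mean()

# Total objective over an L-layer hierarchy (sketch):
# loss = sum(per_layer_supcon(features, labels_at_layer[l]) for l in range(L))
```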
Anthology ID:
2024.findings-acl.723
Volume:
Findings of the Association for Computational Linguistics: ACL 2024
Month:
August
Year:
2024
Address:
Bangkok, Thailand and virtual meeting
Editors:
Lun-Wei Ku, Andre Martins, Vivek Srikumar
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12146–12158
URL:
https://aclanthology.org/2024.findings-acl.723
Cite (ACL):
Sishi Xiong, Yu Zhao, Jie Zhang, Li Mengxiang, Zhongjiang He, Xuelong Li, and Shuangyong Song. 2024. Dual Prompt Tuning based Contrastive Learning for Hierarchical Text Classification. In Findings of the Association for Computational Linguistics: ACL 2024, pages 12146–12158, Bangkok, Thailand and virtual meeting. Association for Computational Linguistics.
Cite (Informal):
Dual Prompt Tuning based Contrastive Learning for Hierarchical Text Classification (Xiong et al., Findings 2024)
PDF:
https://aclanthology.org/2024.findings-acl.723.pdf