Dual Prompt Tuning based Contrastive Learning for Hierarchical Text Classification
Sishi Xiong | Yu Zhao | Jie Zhang | Li Mengxiang | Zhongjiang He | Xuelong Li | Shuangyong Song
Findings of the Association for Computational Linguistics: ACL 2024
Hierarchical text classification aims to categorize texts into a multi-tiered, tree-structured hierarchy of labels. Existing methods focus on capturing hierarchy-aware text features by exploiting explicit parent-child relationships, while interactions between peer labels are rarely taken into account, resulting in severe label confusion within each layer. In this work, we propose a novel Dual Prompt Tuning (DPT) method, which emphasizes identifying discrimination among peer labels by performing contrastive learning on each hierarchical layer. We design an innovative hand-crafted prompt containing slots for both positive and negative label predictions to cooperate with contrastive learning. In addition, we introduce a label hierarchy self-sensing auxiliary task to ensure cross-layer label consistency. Extensive experiments demonstrate that DPT achieves significant improvements and outperforms current state-of-the-art methods on the BGC and RCV1-V2 benchmark datasets.
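
To illustrate the layer-wise contrastive idea in the abstract, below is a minimal, hypothetical PyTorch sketch of a supervised contrastive loss applied independently at one hierarchy level, so that texts sharing the same peer label attract and others repel. It is not the paper's implementation; the function name, tensor shapes, and temperature value are assumptions for illustration only.

import torch
import torch.nn.functional as F

def per_layer_contrastive_loss(embeddings, layer_labels, temperature=0.1):
    """Supervised contrastive loss for one hierarchy layer (illustrative sketch).

    embeddings:   (batch, dim) text representations for this layer
    layer_labels: (batch,) label ids at the current layer (peer labels)
    """
    z = F.normalize(embeddings, dim=-1)
    sim = z @ z.t() / temperature                       # pairwise cosine similarities
    self_mask = torch.eye(len(z), dtype=torch.bool, device=z.device)
    pos_mask = (layer_labels.unsqueeze(0) == layer_labels.unsqueeze(1)) & ~self_mask

    # log-softmax over all other samples in the batch (self excluded)
    sim = sim.masked_fill(self_mask, float('-inf'))
    log_prob = sim - torch.logsumexp(sim, dim=1, keepdim=True)
    log_prob = log_prob.masked_fill(self_mask, 0.0)     # avoid -inf * 0 on the diagonal

    # average log-probability over positives, then over samples that have positives
    pos_counts = pos_mask.sum(1).clamp(min=1)
    loss = -(log_prob * pos_mask).sum(1) / pos_counts
    return loss[pos_mask.any(1)].mean()

In a full model, one such loss could be computed per hierarchy level and summed with the classification objective; the paper additionally couples this with its dual prompt design and a label hierarchy self-sensing task, which are not shown here.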