Task-guided Disentangled Tuning for Pretrained Language Models

Jiali Zeng, Yufan Jiang, Shuangzhi Wu, Yongjing Yin, Mu Li


Abstract
Pretrained language models (PLMs) trained on large-scale unlabeled corpora are typically fine-tuned on task-specific downstream datasets, which has produced state-of-the-art results on various NLP tasks. However, the data discrepancy in domain and scale means fine-tuning can fail to efficiently capture task-specific patterns, especially in low-data regimes. To address this issue, we propose Task-guided Disentangled Tuning (TDT) for PLMs, which enhances the generalization of representations by disentangling task-relevant signals from the entangled representations. For a given task, we introduce a learnable confidence model to detect indicative guidance from context, and further propose a disentangled regularization to mitigate the over-reliance problem. Experimental results on the GLUE and CLUE benchmarks show that TDT gives consistently better results than fine-tuning with different PLMs, and extensive analysis demonstrates the effectiveness and robustness of our method. Code is available at https://github.com/lemon0830/TDT.
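The abstract describes the mechanism only at a high level. Below is a minimal, illustrative PyTorch sketch of the two ingredients it names, a learnable token-level confidence model and a disentangled regularization that discourages over-reliance on a few indicative tokens. This is not the authors' released implementation (see lemon0830/TDT for that); the module names, the confidence-weighted pooling, and the KL-based penalty are assumptions made for illustration.

```python
# Sketch only: names and the exact form of the regularizer are assumptions,
# not the paper's released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class ConfidenceGuidedHead(nn.Module):
    """Token-level confidence scorer + classifier over confidence-weighted pooling."""
    def __init__(self, hidden_size: int, num_labels: int):
        super().__init__()
        self.confidence = nn.Linear(hidden_size, 1)      # scores task relevance per token
        self.classifier = nn.Linear(hidden_size, num_labels)

    def forward(self, hidden_states, attention_mask):
        # hidden_states: (batch, seq_len, hidden) from a PLM encoder
        scores = self.confidence(hidden_states).squeeze(-1)           # (batch, seq_len)
        scores = scores.masked_fill(attention_mask == 0, -1e4)
        weights = torch.softmax(scores, dim=-1)                       # confidence distribution
        pooled = torch.einsum("bs,bsh->bh", weights, hidden_states)   # weighted pooling
        logits = self.classifier(pooled)
        return logits, weights

def disentangled_regularizer(head, hidden_states, attention_mask, logits, weights, k=3):
    # Mask the top-k most confident tokens and keep the prediction from the
    # remaining context close to the full prediction, so the model does not
    # over-rely on a few indicative tokens (assumed form of the penalty).
    topk = weights.topk(k, dim=-1).indices
    masked = attention_mask.clone()
    masked.scatter_(1, topk, 0)
    masked_logits, _ = head(hidden_states, masked)
    return F.kl_div(F.log_softmax(masked_logits, dim=-1),
                    F.softmax(logits, dim=-1), reduction="batchmean")
```

In training, such a regularizer would typically be added to the standard cross-entropy loss with a small weight, e.g. `loss = ce + lam * reg`.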
Anthology ID:
2022.findings-acl.247
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
3126–3137
URL:
https://aclanthology.org/2022.findings-acl.247
DOI:
10.18653/v1/2022.findings-acl.247
Cite (ACL):
Jiali Zeng, Yufan Jiang, Shuangzhi Wu, Yongjing Yin, and Mu Li. 2022. Task-guided Disentangled Tuning for Pretrained Language Models. In Findings of the Association for Computational Linguistics: ACL 2022, pages 3126–3137, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Task-guided Disentangled Tuning for Pretrained Language Models (Zeng et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.247.pdf
Code:
lemon0830/tdt
Data:
CLUE, CMNLI, GLUE