DistillCSE: Distilled Contrastive Learning for Sentence Embeddings

Jiahao Xu, Wei Shao, Lihui Chen, Lemao Liu


Abstract
This paper proposes DistillCSE, a framework that performs contrastive learning under the self-training paradigm with knowledge distillation. The potential advantage of DistillCSE is its self-enhancing feature: using a base model to provide additional supervision signals, a stronger model may be learned through knowledge distillation. However, the vanilla DistillCSE, implemented with standard knowledge distillation, achieves only marginal improvements. Quantitative analyses reveal the reason: owing to the nature of contrastive learning, the teacher model’s logits exhibit relatively large variance. To mitigate the issue induced by this high variance, the paper proposes two simple yet effective remedies for knowledge distillation: a Group-P shuffling strategy that acts as an implicit regularizer, and averaging the logits from multiple teacher components. Experiments on standard benchmarks demonstrate that DistillCSE outperforms many strong baselines and achieves new state-of-the-art performance.
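The sketch below illustrates the logit-averaging idea described in the abstract: a student's in-batch contrastive logits are distilled toward the average of several teachers' logits to reduce teacher variance. This is a minimal, hypothetical PyTorch rendering, not the authors' implementation; the temperatures, the KL-based distillation term, the mixing weight `alpha`, and the function names are illustrative assumptions.

```python
# Minimal sketch (assumed form, not the authors' code) of contrastive
# distillation with averaged teacher logits for sentence embeddings.
import torch
import torch.nn.functional as F

def contrastive_logits(emb_a, emb_b, temperature=0.05):
    """In-batch similarity logits: row i scores view a of sentence i against all views b."""
    emb_a = F.normalize(emb_a, dim=-1)
    emb_b = F.normalize(emb_b, dim=-1)
    return emb_a @ emb_b.t() / temperature

def distill_loss(student_logits, teacher_logits_list, labels, alpha=0.5, tau=1.0):
    """InfoNCE term for the student plus KL to the average of several teachers' logits."""
    # Standard contrastive (InfoNCE) objective over in-batch negatives.
    ce = F.cross_entropy(student_logits, labels)
    # Average logits from multiple teacher components to reduce their variance.
    teacher_logits = torch.stack(teacher_logits_list, dim=0).mean(dim=0)
    kd = F.kl_div(
        F.log_softmax(student_logits / tau, dim=-1),
        F.softmax(teacher_logits / tau, dim=-1),
        reduction="batchmean",
    ) * tau ** 2
    return (1 - alpha) * ce + alpha * kd

# Toy usage with random embeddings standing in for encoder outputs.
batch, dim = 8, 768
labels = torch.arange(batch)
student = contrastive_logits(torch.randn(batch, dim), torch.randn(batch, dim))
teachers = [contrastive_logits(torch.randn(batch, dim), torch.randn(batch, dim))
            for _ in range(3)]
loss = distill_loss(student, teachers, labels)
```

The Group-P shuffling regularizer mentioned in the abstract is not shown here, as its exact formulation is specified in the paper rather than in this summary.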
Anthology ID:
2023.findings-emnlp.547
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
8153–8165
URL:
https://aclanthology.org/2023.findings-emnlp.547
DOI:
10.18653/v1/2023.findings-emnlp.547
Cite (ACL):
Jiahao Xu, Wei Shao, Lihui Chen, and Lemao Liu. 2023. DistillCSE: Distilled Contrastive Learning for Sentence Embeddings. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 8153–8165, Singapore. Association for Computational Linguistics.
Cite (Informal):
DistillCSE: Distilled Contrastive Learning for Sentence Embeddings (Xu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.547.pdf