Contrastive Learning of Sentence Representations

Hefei Qiu, Wei Ding, Ping Chen


Abstract
Learning sentence representations that capture rich semantic meaning is crucial for many NLP tasks. Pre-trained language models such as BERT have achieved great success in NLP, but sentence embeddings extracted directly from these models do not perform well without fine-tuning. We propose Contrastive Learning of Sentence Representations (CLSR), a novel approach which applies contrastive learning to learn universal sentence representations on top of pre-trained language models. CLSR utilizes the semantic similarity of two sentences to construct positive instances for contrastive learning. Semantic information captured by the pre-trained models is preserved by extracting sentence embeddings from them with a proper pooling strategy. An encoder followed by a linear projection takes these embeddings as inputs and is trained under a contrastive objective. To evaluate the performance of CLSR, we run experiments on a range of pre-trained language models and their variants across a series of Semantic Textual Similarity tasks. Results show that CLSR gains significant performance improvements over existing SOTA language models.
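As a rough illustration of the pipeline the abstract describes, the sketch below shows mean-pooled sentence embeddings from a pre-trained backbone, a small encoder with a linear projection, and an in-batch InfoNCE-style contrastive loss in which semantically similar sentence pairs act as positives. This is a minimal sketch, not the authors' released implementation; the backbone name, dimensions, temperature, and example sentence pairs are illustrative assumptions.

# Minimal sketch of contrastive training on top of pre-trained sentence embeddings.
# Assumptions (not from the paper): bert-base-uncased backbone, mean pooling,
# a 768->768->256 encoder/projection head, temperature 0.05, in-batch negatives.
import torch
import torch.nn as nn
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

class ProjectionEncoder(nn.Module):
    """Encoder followed by a linear projection, as described in the abstract."""
    def __init__(self, in_dim=768, hidden=768, out_dim=256):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.project = nn.Linear(hidden, out_dim)

    def forward(self, x):
        return self.project(self.encoder(x))

def mean_pool(last_hidden, attention_mask):
    """Average token embeddings while ignoring padding (one common pooling strategy)."""
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden * mask).sum(1) / mask.sum(1).clamp(min=1e-9)

def info_nce(z1, z2, temperature=0.05):
    """Contrastive objective: matched pairs (z1[i], z2[i]) are positives,
    all other in-batch combinations serve as negatives."""
    z1, z2 = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)

# Usage sketch: each pair (sents_a[i], sents_b[i]) is assumed semantically similar.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
backbone = AutoModel.from_pretrained("bert-base-uncased")
head = ProjectionEncoder()

sents_a = ["A man is playing a guitar.", "Two dogs run through a field."]
sents_b = ["Someone plays an acoustic guitar.", "A pair of dogs race across the grass."]
batch_a = tokenizer(sents_a, return_tensors="pt", padding=True)
batch_b = tokenizer(sents_b, return_tensors="pt", padding=True)

with torch.no_grad():  # keep the pre-trained semantics fixed; only the head is trained here
    emb_a = mean_pool(backbone(**batch_a).last_hidden_state, batch_a["attention_mask"])
    emb_b = mean_pool(backbone(**batch_b).last_hidden_state, batch_b["attention_mask"])

loss = info_nce(head(emb_a), head(emb_b))
loss.backward()  # gradients flow only into the encoder/projection head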
Anthology ID:
2021.icon-main.33
Volume:
Proceedings of the 18th International Conference on Natural Language Processing (ICON)
Month:
December
Year:
2021
Address:
National Institute of Technology Silchar, Silchar, India
Editors:
Sivaji Bandyopadhyay, Sobha Lalitha Devi, Pushpak Bhattacharyya
Venue:
ICON
Publisher:
NLP Association of India (NLPAI)
Pages:
277–283
URL:
https://aclanthology.org/2021.icon-main.33
Cite (ACL):
Hefei Qiu, Wei Ding, and Ping Chen. 2021. Contrastive Learning of Sentence Representations. In Proceedings of the 18th International Conference on Natural Language Processing (ICON), pages 277–283, National Institute of Technology Silchar, Silchar, India. NLP Association of India (NLPAI).
Cite (Informal):
Contrastive Learning of Sentence Representations (Qiu et al., ICON 2021)
PDF:
https://aclanthology.org/2021.icon-main.33.pdf
Data
SNLI