An Information Minimization Based Contrastive Learning Model for Unsupervised Sentence Embeddings Learning

Shaobin Chen, Jie Zhou, Yuling Sun, Liang He


Abstract
Unsupervised sentence embedding learning has recently been dominated by contrastive learning methods (e.g., SimCSE), which keep positive pairs similar and push negative pairs apart. The contrast operation aims to retain as much information as possible by maximizing the mutual information between positive instances, which leads to redundant information in the sentence embedding. To address this problem, we present an information minimization based contrastive learning (InforMin-CL) model for unsupervised sentence representation learning, which retains useful information and discards redundant information by simultaneously maximizing the mutual information and minimizing the information entropy between positive instances. Specifically, we find that information minimization can be achieved with simple contrast and reconstruction objectives: the reconstruction operation reconstructs one positive instance from the other to minimize the information entropy between positive instances. We evaluate our model on fourteen downstream tasks, including both supervised and unsupervised (semantic textual similarity) tasks. Extensive experimental results show that InforMin-CL achieves state-of-the-art performance.
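The following is a minimal, illustrative sketch (not the authors' released code; see the linked repository for that) of how a contrastive InfoNCE-style objective can be combined with a reconstruction objective between positive instances, as described in the abstract. The `reconstructor` MLP, the temperature, and the weighting `alpha` are assumptions made for illustration only.

```python
# Illustrative sketch only: contrastive loss plus a reconstruction term between
# positive instances. Architecture and hyperparameters here are assumptions,
# not the paper's actual implementation.
import torch
import torch.nn.functional as F

def informin_style_loss(z1, z2, reconstructor, temperature=0.05, alpha=1.0):
    """z1, z2: (batch, dim) embeddings of two views (a positive pair) of the same sentences.
    reconstructor: a module that predicts one positive's embedding from the other.
    Returns contrastive loss + alpha * reconstruction loss."""
    # Contrastive term: keep positives close, push in-batch negatives apart.
    z1n, z2n = F.normalize(z1, dim=-1), F.normalize(z2, dim=-1)
    sim = z1n @ z2n.t() / temperature                   # (batch, batch) similarity matrix
    labels = torch.arange(z1.size(0), device=z1.device) # diagonal entries are the positives
    contrast = F.cross_entropy(sim, labels)

    # Reconstruction term: reconstruct one positive from the other, which in the
    # paper's framing discourages keeping view-specific, redundant information.
    recon = F.mse_loss(reconstructor(z1), z2.detach())

    return contrast + alpha * recon

if __name__ == "__main__":
    # Toy usage with random embeddings and an assumed two-layer reconstructor.
    dim = 768
    reconstructor = torch.nn.Sequential(
        torch.nn.Linear(dim, dim), torch.nn.Tanh(), torch.nn.Linear(dim, dim)
    )
    z1, z2 = torch.randn(8, dim), torch.randn(8, dim)
    print(informin_style_loss(z1, z2, reconstructor).item())
```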
Anthology ID:
2022.coling-1.426
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Editors:
Nicoletta Calzolari, Chu-Ren Huang, Hansaem Kim, James Pustejovsky, Leo Wanner, Key-Sun Choi, Pum-Mo Ryu, Hsin-Hsi Chen, Lucia Donatelli, Heng Ji, Sadao Kurohashi, Patrizia Paggio, Nianwen Xue, Seokhwan Kim, Younggyun Hahm, Zhong He, Tony Kyungil Lee, Enrico Santus, Francis Bond, Seung-Hoon Na
Venue:
COLING
SIG:
Publisher:
International Committee on Computational Linguistics
Note:
Pages:
4821–4831
Language:
URL:
https://aclanthology.org/2022.coling-1.426
DOI:
Bibkey:
Cite (ACL):
Shaobin Chen, Jie Zhou, Yuling Sun, and Liang He. 2022. An Information Minimization Based Contrastive Learning Model for Unsupervised Sentence Embeddings Learning. In Proceedings of the 29th International Conference on Computational Linguistics, pages 4821–4831, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
An Information Minimization Based Contrastive Learning Model for Unsupervised Sentence Embeddings Learning (Chen et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.426.pdf
Code
 bin199/informin-cl
Data
IMDb Movie Reviews, MPQA Opinion Corpus, MRPC, SST, SentEval