%0 Conference Proceedings %T InfoCSE: Information-aggregated Contrastive Learning of Sentence Embeddings %A Wu, Xing %A Gao, Chaochen %A Lin, Zijia %A Han, Jizhong %A Wang, Zhongyuan %A Hu, Songlin %Y Goldberg, Yoav %Y Kozareva, Zornitsa %Y Zhang, Yue %S Findings of the Association for Computational Linguistics: EMNLP 2022 %D 2022 %8 December %I Association for Computational Linguistics %C Abu Dhabi, United Arab Emirates %F wu-etal-2022-infocse %X Contrastive learning has been extensively studied in sentence embedding learning, which assumes that the embeddings of different views of the same sentence are closer. The constraint brought by this assumption is weak, and a good sentence representation should also be able to reconstruct the original sentence fragments. Therefore, this paper proposes an information-aggregated contrastive learning framework for learning unsupervised sentence embeddings, termed InfoCSE. InfoCSE forces the representation of the [CLS] position to aggregate denser sentence information by introducing an additional masked language model task and a well-designed network. We evaluate the proposed InfoCSE on several benchmark datasets w.r.t. the semantic textual similarity (STS) task. Experimental results show that InfoCSE outperforms SimCSE by an average Spearman correlation of 2.60% on BERT-base and 1.77% on BERT-large, achieving state-of-the-art results among unsupervised sentence representation learning methods. %R 10.18653/v1/2022.findings-emnlp.223 %U https://aclanthology.org/2022.findings-emnlp.223 %U https://doi.org/10.18653/v1/2022.findings-emnlp.223 %P 3060-3070