%0 Conference Proceedings
%T A Contrastive Framework for Learning Sentence Representations from Pairwise and Triple-wise Perspective in Angular Space
%A Zhang, Yuhao
%A Zhu, Hongji
%A Wang, Yongliang
%A Xu, Nan
%A Li, Xiaobo
%A Zhao, Binqiang
%Y Muresan, Smaranda
%Y Nakov, Preslav
%Y Villavicencio, Aline
%S Proceedings of the 60th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers)
%D 2022
%8 May
%I Association for Computational Linguistics
%C Dublin, Ireland
%F zhang-etal-2022-contrastive
%X Learning high-quality sentence representations is a fundamental problem in natural language processing that can benefit a wide range of downstream tasks. Although BERT-like pre-trained language models have achieved great success, using their sentence representations directly often results in poor performance on the semantic textual similarity task. Recently, several contrastive learning methods have been proposed for learning sentence representations and have shown promising results. However, most of them focus on the construction of positive and negative representation pairs and pay little attention to the training objective, such as NT-Xent, which is not sufficient to acquire discriminating power and is unable to model the partial order of semantics between sentences. In this paper, we propose a new method, ArcCSE, with training objectives designed to enhance pairwise discriminative power and to model the entailment relation of triplet sentences. We conduct extensive experiments which demonstrate that our approach outperforms the previous state-of-the-art on diverse sentence-related tasks, including STS and SentEval.
%R 10.18653/v1/2022.acl-long.336
%U https://aclanthology.org/2022.acl-long.336
%U https://doi.org/10.18653/v1/2022.acl-long.336
%P 4892-4903