Sentence Representation Learning with Generative Objective rather than Contrastive Objective

Bohong Wu, Hai Zhao


Abstract
Though offering impressive contextualized token-level representations, current pre-trained language models pay little attention to accurately acquiring sentence-level representations during their self-supervised pre-training. Moreover, the contrastive objectives that dominate current sentence representation learning offer little linguistic interpretability and no performance guarantee on downstream semantic tasks. We instead propose a novel generative self-supervised learning objective based on phrase reconstruction. To overcome the drawbacks of previous generative methods, we carefully model intra-sentence structure by breaking one sentence down into pieces of important phrases. Empirical studies show that our generative objective yields substantial performance gains and outperforms the current state-of-the-art contrastive methods not only on the STS benchmarks, but also on downstream semantic retrieval and reranking tasks. Our code is available at https://github.com/chengzhipanpan/PaSeR.
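To make the phrase-reconstruction idea in the abstract concrete, below is a minimal PyTorch sketch of a generative objective of this kind. It is not the authors' PaSeR implementation (see the repository linked above); the model sizes, mean pooling, causal-mask construction, and random toy inputs are illustrative assumptions. The point it shows: a sentence encoder is compressed into a single pooled vector, and a small decoder must regenerate a held-out phrase from that vector alone, so the token-level cross-entropy loss pressures the pooled vector to carry sentence-level meaning.

# Minimal, illustrative sketch of a generative phrase-reconstruction objective.
# NOT the authors' released PaSeR code; sizes, pooling, and inputs are assumptions.
import torch
import torch.nn as nn

class PhraseReconstructionSketch(nn.Module):
    def __init__(self, vocab_size=30522, d_model=256, nhead=4, num_layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        enc_layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        dec_layer = nn.TransformerDecoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers)
        self.decoder = nn.TransformerDecoder(dec_layer, num_layers)
        self.lm_head = nn.Linear(d_model, vocab_size)

    def forward(self, sent_ids, phrase_in):
        # Encode the (phrase-masked) sentence, then mean-pool token states
        # into one sentence vector (assumed pooling choice).
        enc_states = self.encoder(self.embed(sent_ids))
        sent_vec = enc_states.mean(dim=1, keepdim=True)              # (B, 1, d)
        # Causal mask so the decoder predicts phrase tokens left to right.
        T = phrase_in.size(1)
        tgt_mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        dec_states = self.decoder(self.embed(phrase_in), memory=sent_vec,
                                  tgt_mask=tgt_mask)
        return self.lm_head(dec_states)                               # (B, T, vocab)

# Toy usage with random ids: sent_ids stands in for the phrase-masked sentence,
# phrase_ids for the phrase to reconstruct (teacher forcing: shift by one token).
model = PhraseReconstructionSketch()
sent_ids = torch.randint(0, 30522, (2, 16))
phrase_ids = torch.randint(0, 30522, (2, 5))
logits = model(sent_ids, phrase_ids[:, :-1])
loss = nn.CrossEntropyLoss()(logits.reshape(-1, logits.size(-1)),
                             phrase_ids[:, 1:].reshape(-1))
loss.backward()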
Anthology ID: 2022.emnlp-main.221
Volume: Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month: December
Year: 2022
Address: Abu Dhabi, United Arab Emirates
Editors: Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue: EMNLP
Publisher: Association for Computational Linguistics
Pages: 3356–3368
URL: https://aclanthology.org/2022.emnlp-main.221
DOI: 10.18653/v1/2022.emnlp-main.221
Cite (ACL): Bohong Wu and Hai Zhao. 2022. Sentence Representation Learning with Generative Objective rather than Contrastive Objective. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 3356–3368, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal): Sentence Representation Learning with Generative Objective rather than Contrastive Objective (Wu & Zhao, EMNLP 2022)
PDF: https://aclanthology.org/2022.emnlp-main.221.pdf