Bootstrapped Unsupervised Sentence Representation Learning

Yan Zhang, Ruidan He, Zuozhu Liu, Lidong Bing, Haizhou Li


Abstract
As high-quality labeled data is scarce, unsupervised sentence representation learning has attracted much attention. In this paper, we propose a new framework with a two-branch Siamese Network which maximizes the similarity between two augmented views of each sentence. Specifically, given one augmented view of the input sentence, the online network branch is trained by predicting the representation yielded by the target network of the same sentence under another augmented view. Meanwhile, the target network branch is bootstrapped with a moving average of the online network. The proposed method significantly outperforms other state-of-the-art unsupervised methods on semantic textual similarity (STS) and classification tasks. It can be adopted as a post-training procedure to boost the performance of the supervised methods. We further extend our method for learning multilingual sentence representations and demonstrate its effectiveness on cross-lingual STS tasks. Our code is available at https://github.com/yanzhangnlp/BSL.
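The bootstrapping step described in the abstract, where the target branch tracks a moving average of the online branch, can be sketched as a simple exponential-moving-average (EMA) parameter update. This is a minimal illustration only; the function name `ema_update` and the decay value `tau` are assumptions, not the paper's exact settings.

```python
def ema_update(target_params, online_params, tau=0.99):
    """Bootstrap the target network: each target parameter becomes
    tau * target + (1 - tau) * online, so the target is a slowly
    moving average of the online network."""
    return [tau * t + (1.0 - tau) * o
            for t, o in zip(target_params, online_params)]

# Toy example with two scalar "parameters" (tau=0.5 keeps arithmetic exact).
online = [1.0, -2.0]
target = [0.0, 0.0]
target = ema_update(target, online, tau=0.5)
print(target)  # -> [0.5, -1.0]
```

In BYOL-style training, only the online branch receives gradients (from predicting the target branch's representation of the other augmented view); the target branch is updated exclusively through this EMA step.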
Anthology ID:
2021.acl-long.402
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
5168–5180
URL:
https://aclanthology.org/2021.acl-long.402
DOI:
10.18653/v1/2021.acl-long.402
Cite (ACL):
Yan Zhang, Ruidan He, Zuozhu Liu, Lidong Bing, and Haizhou Li. 2021. Bootstrapped Unsupervised Sentence Representation Learning. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 5168–5180, Online. Association for Computational Linguistics.
Cite (Informal):
Bootstrapped Unsupervised Sentence Representation Learning (Zhang et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.402.pdf
Video:
https://aclanthology.org/2021.acl-long.402.mp4
Code:
yanzhangnlp/bsl
Data:
BookCorpus | MultiNLI | SICK | SNLI | SentEval