%0 Conference Proceedings
%T WhiteningBERT: An Easy Unsupervised Sentence Embedding Approach
%A Huang, Junjie
%A Tang, Duyu
%A Zhong, Wanjun
%A Lu, Shuai
%A Shou, Linjun
%A Gong, Ming
%A Jiang, Daxin
%A Duan, Nan
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Findings of the Association for Computational Linguistics: EMNLP 2021
%D 2021
%8 November
%I Association for Computational Linguistics
%C Punta Cana, Dominican Republic
%F huang-etal-2021-whiteningbert-easy
%X Producing the embedding of a sentence in an unsupervised way is valuable to natural language matching and retrieval problems in practice. In this work, we conduct a thorough examination of pretrained model based unsupervised sentence embeddings. We study on four pretrained models and conduct massive experiments on seven datasets regarding sentence semantics. We have three main findings. First, averaging all tokens is better than only using [CLS] vector. Second, combining both top and bottom layers is better than only using top layers. Lastly, an easy whitening-based vector normalization strategy with less than 10 lines of code consistently boosts the performance. The whole project including codes and data is publicly available at https://github.com/Jun-jie-Huang/WhiteningBERT.
%R 10.18653/v1/2021.findings-emnlp.23
%U https://aclanthology.org/2021.findings-emnlp.23
%U https://doi.org/10.18653/v1/2021.findings-emnlp.23
%P 238-244