Compressing Sentence Representation for Semantic Retrieval via Homomorphic Projective Distillation

Xuandong Zhao, Zhiguo Yu, Ming Wu, Lei Li


Abstract
How can we learn highly compact yet effective sentence representations? Pre-trained language models are effective in many NLP tasks, but they are often huge and produce large sentence embeddings, and there is a substantial performance gap between large and small models. In this paper, we propose Homomorphic Projective Distillation (HPD) to learn compressed sentence embeddings. Our method augments a small Transformer encoder with learnable projection layers that produce compact representations while mimicking a large pre-trained language model to retain sentence representation quality. We evaluate our method at different model sizes on both semantic textual similarity (STS) and semantic retrieval (SR) tasks. Experiments show that our method gains 2.7-4.5 points on STS tasks over the previous best representations of the same size. On SR tasks, our method improves retrieval speed by 8.2× and reduces memory usage by 8.0× compared with state-of-the-art large models. Our implementation is available at https://github.com/XuandongZhao/HPD.
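
The abstract describes the core recipe: a small Transformer encoder whose pooled output is compressed by a learnable projection, trained so that a re-expanded view of the compact embedding mimics a large teacher's sentence embedding. Below is a minimal PyTorch sketch of that recipe under stated assumptions; the student and teacher checkpoints, mean pooling, dimensions, and MSE distillation loss are illustrative choices, not the paper's exact setup (see the released code at xuandongzhao/hpd for the authors' implementation).

import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

class ProjectedStudent(nn.Module):
    """Small encoder + projection layers (hypothetical HPD-style student)."""
    def __init__(self, student_name="prajjwal1/bert-mini",
                 teacher_dim=1024, compact_dim=128):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(student_name)
        hidden = self.encoder.config.hidden_size
        # Compress to the compact embedding used at retrieval time, then
        # map back up to the teacher's dimension for the distillation loss.
        self.down = nn.Linear(hidden, compact_dim)
        self.up = nn.Linear(compact_dim, teacher_dim)

    def forward(self, inputs):
        hidden = self.encoder(**inputs).last_hidden_state
        mask = inputs["attention_mask"].unsqueeze(-1).float()
        pooled = (hidden * mask).sum(1) / mask.sum(1)  # mean pooling
        compact = self.down(pooled)           # small embedding for retrieval
        return compact, self.up(compact)      # teacher-sized view for training

# Frozen teacher producing target sentence embeddings (assumed checkpoint).
t_name = "princeton-nlp/sup-simcse-roberta-large"
teacher = AutoModel.from_pretrained(t_name).eval()
t_tok = AutoTokenizer.from_pretrained(t_name)
s_tok = AutoTokenizer.from_pretrained("prajjwal1/bert-mini")
student = ProjectedStudent(teacher_dim=teacher.config.hidden_size)
opt = torch.optim.AdamW(student.parameters(), lr=3e-5)

sentences = ["A man is playing a guitar.", "Someone plays an instrument."]
s_batch = s_tok(sentences, return_tensors="pt", padding=True, truncation=True)
t_batch = t_tok(sentences, return_tensors="pt", padding=True, truncation=True)

with torch.no_grad():  # teacher targets: mean-pooled last hidden states
    t_hidden = teacher(**t_batch).last_hidden_state
    t_mask = t_batch["attention_mask"].unsqueeze(-1).float()
    t_emb = (t_hidden * t_mask).sum(1) / t_mask.sum(1)

compact, projected = student(s_batch)
loss = nn.functional.mse_loss(projected, t_emb)  # assumed distillation objective
loss.backward()
opt.step()
# At test time only `compact` (128-dim) is stored and indexed, which is what
# shrinks the retrieval index and speeds up search.

The key design point this sketch illustrates is that the up-projection is used only during training: the student is supervised in the teacher's embedding space, but only the low-dimensional embedding is kept for semantic retrieval.
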
Anthology ID:
2022.findings-acl.64
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
774–781
URL:
https://aclanthology.org/2022.findings-acl.64
DOI:
10.18653/v1/2022.findings-acl.64
Cite (ACL):
Xuandong Zhao, Zhiguo Yu, Ming Wu, and Lei Li. 2022. Compressing Sentence Representation for Semantic Retrieval via Homomorphic Projective Distillation. In Findings of the Association for Computational Linguistics: ACL 2022, pages 774–781, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Compressing Sentence Representation for Semantic Retrieval via Homomorphic Projective Distillation (Zhao et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.64.pdf
Code:
xuandongzhao/hpd
Data:
MultiNLI, SICK, SNLI