Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words

Hiroto Kurita, Goro Kobayashi, Sho Yokoi, Kentaro Inui


Abstract
The performance of sentence encoders can be significantly improved through the simple practice of fine-tuning with a contrastive loss. A natural question arises: what characteristics do models acquire during contrastive learning? This paper theoretically and experimentally shows that contrastive learning-based sentence encoders implicitly weight words based on information-theoretic quantities; that is, more informative words receive greater weight, while others receive less. The theory states that, in the lower bound of the optimal value of the contrastive learning objective, the norm of a word embedding reflects the information gain associated with the distribution of its surrounding words. We also conduct comprehensive experiments using various models, multiple datasets, two methods for measuring the implicit weighting of models (Integrated Gradients and SHAP), and two information-theoretic quantities (information gain and self-information). The results provide empirical evidence that contrastive fine-tuning emphasizes informative words.
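To make the two quantities in the abstract concrete, below is a minimal Python sketch (not the paper's implementation) of self-information, I(w) = -log p(w) under a unigram model, and of information gain measured as the KL divergence between the distribution of words surrounding w and the corpus-wide word distribution. The toy corpus, the sentence-level co-occurrence window, and the base-2 logarithm are illustrative assumptions; the paper's exact definitions and estimators may differ.

```python
import math
from collections import Counter

# Toy corpus: each sentence is a list of tokens (illustrative only).
corpus = [
    "the cat sat on the mat".split(),
    "the dog sat on the rug".split(),
    "quantum entanglement defies classical intuition".split(),
]

# Unigram counts for the marginal word distribution p(w).
unigram = Counter(w for sent in corpus for w in sent)
total = sum(unigram.values())

def self_information(word: str) -> float:
    """Self-information (surprisal) of a word: I(w) = -log2 p(w)."""
    return -math.log2(unigram[word] / total)

# Co-occurrence counts: words appearing in the same sentence as w
# (a sentence-level window is an assumption made for brevity).
cooc = {w: Counter() for w in unigram}
for sent in corpus:
    for w in sent:
        for c in sent:
            if c != w:
                cooc[w][c] += 1

def information_gain(word: str, eps: float = 1e-12) -> float:
    """KL(p(c | w) || p(c)): how much observing `word` shifts the
    distribution of surrounding words away from the marginal."""
    ctx = cooc[word]
    n = sum(ctx.values())
    kl = 0.0
    for c, k in ctx.items():
        p_c_given_w = k / n
        p_c = unigram[c] / total
        kl += p_c_given_w * math.log2(p_c_given_w / (p_c + eps))
    return kl

for w in ["the", "quantum"]:
    print(f"{w:>8}  I(w) = {self_information(w):.2f} bits  "
          f"KL = {information_gain(w):.2f} bits")
```

On this toy corpus, the frequent function word "the" gets low self-information and low information gain, while the rarer content word "quantum" scores high on both, matching the intuition that informative words should be weighted more heavily.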
Anthology ID:
2023.findings-emnlp.729
Volume:
Findings of the Association for Computational Linguistics: EMNLP 2023
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
10932–10947
URL:
https://aclanthology.org/2023.findings-emnlp.729
DOI:
10.18653/v1/2023.findings-emnlp.729
Cite (ACL):
Hiroto Kurita, Goro Kobayashi, Sho Yokoi, and Kentaro Inui. 2023. Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words. In Findings of the Association for Computational Linguistics: EMNLP 2023, pages 10932–10947, Singapore. Association for Computational Linguistics.
Cite (Informal):
Contrastive Learning-based Sentence Encoders Implicitly Weight Informative Words (Kurita et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-emnlp.729.pdf
Video:
https://aclanthology.org/2023.findings-emnlp.729.mp4