On Isotropy, Contextualization and Learning Dynamics of Contrastive-based Sentence Representation Learning

Chenghao Xiao, Yang Long, Noura Al Moubayed


Abstract
Incorporating contrastive learning objectives in sentence representation learning (SRL) has yielded significant improvements on many sentence-level NLP tasks. However, it is not well understood why contrastive learning works for learning sentence-level semantics. In this paper, we aim to help guide future designs of sentence representation learning methods by taking a closer look at contrastive SRL through the lens of isotropy, contextualization and learning dynamics. We interpret its successes through the geometry of the representation shifts and show that contrastive learning brings isotropy, and drives high intra-sentence similarity: when in the same sentence, tokens converge to similar positions in the semantic space. We also find that what we formalize as “spurious contextualization” is mitigated for semantically meaningful tokens, while augmented for functional ones. We find that the embedding space is directed towards the origin during training, with more areas now better defined. We ablate these findings by observing the learning dynamics with different training temperatures, batch sizes and pooling methods.
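As a reading aid for the abstract's notion of intra-sentence similarity, the sketch below computes a rough proxy: the mean pairwise cosine similarity among a sentence's contextualized token embeddings. It is a minimal illustration, not the paper's measurement protocol; the checkpoint (bert-base-uncased), the layer used (final hidden states), and the off-diagonal averaging are assumptions made here for the example.

# Minimal sketch (not the paper's exact protocol): mean pairwise cosine
# similarity of contextualized token embeddings within one sentence.
# Model choice and averaging scheme are illustrative assumptions.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

def intra_sentence_similarity(sentence: str) -> float:
    """Mean pairwise cosine similarity of a sentence's token embeddings
    (special tokens included for simplicity)."""
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state[0]   # (seq_len, hidden_dim)
    normed = F.normalize(hidden, dim=-1)
    sims = normed @ normed.T                            # (seq_len, seq_len)
    seq_len = sims.size(0)
    off_diag = sims.sum() - sims.diagonal().sum()       # exclude self-similarity
    return (off_diag / (seq_len * (seq_len - 1))).item()

print(intra_sentence_similarity("Contrastive learning reshapes the embedding space."))

Comparing this quantity before and after contrastive fine-tuning (e.g., against a SimCSE-style checkpoint) would, per the abstract's claim, show tokens from the same sentence converging to more similar positions in the embedding space.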
Anthology ID: 2023.findings-acl.778
Volume: Findings of the Association for Computational Linguistics: ACL 2023
Month: July
Year: 2023
Address: Toronto, Canada
Editors: Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue: Findings
Publisher: Association for Computational Linguistics
Pages: 12266–12283
URL: https://aclanthology.org/2023.findings-acl.778
DOI: 10.18653/v1/2023.findings-acl.778
Cite (ACL): Chenghao Xiao, Yang Long, and Noura Al Moubayed. 2023. On Isotropy, Contextualization and Learning Dynamics of Contrastive-based Sentence Representation Learning. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12266–12283, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal): On Isotropy, Contextualization and Learning Dynamics of Contrastive-based Sentence Representation Learning (Xiao et al., Findings 2023)
PDF: https://aclanthology.org/2023.findings-acl.778.pdf
Video: https://aclanthology.org/2023.findings-acl.778.mp4