Learning Semantic Textual Similarity via Topic-informed Discrete Latent Variables

Erxin Yu, Lan Du, Yuan Jin, Zhepei Wei, Yi Chang


Abstract
Recently, discrete latent variable models have received a surge of interest in both Natural Language Processing (NLP) and Computer Vision (CV), owing to performance comparable to that of their continuous counterparts in representation learning while being more interpretable in their predictions. In this paper, we develop a topic-informed discrete latent variable model for semantic textual similarity, which learns a shared latent space for sentence-pair representations via vector quantization. Compared with previous models limited to local semantic contexts, our model can exploit richer semantic information via topic modeling. We further boost semantic similarity performance by injecting the quantized representation into a transformer-based language model with a well-designed semantic-driven attention mechanism. We demonstrate, through extensive experiments across various English-language datasets, that our model surpasses several strong neural baselines on semantic textual similarity tasks.
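To make the vector-quantization step concrete, below is a minimal illustrative sketch of how a continuous sentence-pair embedding can be mapped to a discrete code via nearest-neighbor lookup in a codebook. This is not the paper's implementation: the codebook size, latent dimension, and the cosine-similarity scoring step are assumptions chosen only for illustration.

```python
# Minimal sketch of vector quantization (VQ) over sentence embeddings.
# All sizes and the similarity scoring step are illustrative assumptions,
# not the configuration used in the paper.
import numpy as np

rng = np.random.default_rng(0)

K, D = 64, 16                       # hypothetical codebook size and latent dimension
codebook = rng.normal(size=(K, D))  # discrete latent codes (randomly initialized here)

def quantize(z: np.ndarray) -> tuple[np.ndarray, int]:
    """Map a continuous embedding z (shape [D]) to its nearest codebook vector."""
    dists = np.linalg.norm(codebook - z, axis=1)  # distance to every code
    idx = int(np.argmin(dists))                   # index of the closest discrete code
    return codebook[idx], idx

# Toy continuous encodings of a sentence pair (stand-ins for encoder outputs).
z_a = rng.normal(size=D)
z_b = rng.normal(size=D)

q_a, code_a = quantize(z_a)
q_b, code_b = quantize(z_b)

# A simple similarity score over the quantized representations (cosine similarity).
score = float(q_a @ q_b / (np.linalg.norm(q_a) * np.linalg.norm(q_b) + 1e-8))
print(f"codes: ({code_a}, {code_b}), similarity: {score:.3f}")
```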
Anthology ID:
2022.emnlp-main.328
Volume:
Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2022
Address:
Abu Dhabi, United Arab Emirates
Editors:
Yoav Goldberg, Zornitsa Kozareva, Yue Zhang
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
4937–4948
URL:
https://aclanthology.org/2022.emnlp-main.328
DOI:
10.18653/v1/2022.emnlp-main.328
Cite (ACL):
Erxin Yu, Lan Du, Yuan Jin, Zhepei Wei, and Yi Chang. 2022. Learning Semantic Textual Similarity via Topic-informed Discrete Latent Variables. In Proceedings of the 2022 Conference on Empirical Methods in Natural Language Processing, pages 4937–4948, Abu Dhabi, United Arab Emirates. Association for Computational Linguistics.
Cite (Informal):
Learning Semantic Textual Similarity via Topic-informed Discrete Latent Variables (Yu et al., EMNLP 2022)
PDF:
https://aclanthology.org/2022.emnlp-main.328.pdf