Sequential Modelling of the Evolution of Word Representations for Semantic Change Detection

Adam Tsakalidis, Maria Liakata


Abstract
Semantic change detection concerns the task of identifying words whose meaning has changed over time. Current state-of-the-art approaches operating on neural embeddings detect the level of semantic change in a word by comparing its vector representations in two distinct time periods, without considering its evolution through time. In this work, we propose three variants of sequential models for detecting semantically shifted words, effectively accounting for the changes in word representations over time. Through extensive experimentation under various settings with synthetic and real data, we showcase the importance of sequential modelling of word vectors through time for semantic change detection. Finally, we compare different approaches in a quantitative manner, demonstrating that temporal modelling of word representations yields a clear-cut advantage in performance.
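To illustrate the contrast the abstract draws between pairwise comparison of two time periods and modelling a word's full trajectory, here is a minimal sketch. This is not the authors' proposed models; the function names, the step-wise drift score, and the toy data are illustrative assumptions only.

```python
import numpy as np

def cosine_dist(u, v):
    # 1 minus the cosine similarity between two word vectors.
    return 1.0 - np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

def pairwise_change(trajectory):
    # Pairwise baseline: compare only the first and last time
    # periods, ignoring how the representation evolved in between.
    return cosine_dist(trajectory[0], trajectory[-1])

def sequential_change(trajectory):
    # Illustrative sequential score: accumulate the step-by-step
    # drift of the representation across consecutive time periods.
    return sum(cosine_dist(trajectory[t], trajectory[t + 1])
               for t in range(len(trajectory) - 1))

# Toy trajectory: a 4-dimensional word vector observed over 5 periods,
# drifting via a random walk (hypothetical data, not from the paper).
rng = np.random.default_rng(0)
traj = np.cumsum(rng.normal(scale=0.1, size=(5, 4)), axis=0) + 1.0
print(pairwise_change(traj), sequential_change(traj))
```

A word whose vector wanders and then returns near its starting point scores low under the pairwise comparison but high under the sequential score, which is the kind of temporal information the paper's models exploit.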
Anthology ID:
2020.emnlp-main.682
Volume:
Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)
Month:
November
Year:
2020
Address:
Online
Editors:
Bonnie Webber, Trevor Cohn, Yulan He, Yang Liu
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
8485–8497
URL:
https://aclanthology.org/2020.emnlp-main.682
DOI:
10.18653/v1/2020.emnlp-main.682
Cite (ACL):
Adam Tsakalidis and Maria Liakata. 2020. Sequential Modelling of the Evolution of Word Representations for Semantic Change Detection. In Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP), pages 8485–8497, Online. Association for Computational Linguistics.
Cite (Informal):
Sequential Modelling of the Evolution of Word Representations for Semantic Change Detection (Tsakalidis & Liakata, EMNLP 2020)
PDF:
https://aclanthology.org/2020.emnlp-main.682.pdf
Video:
https://slideslive.com/38938897