Statistically Significant Detection of Semantic Shifts using Contextual Word Embeddings

Yang Liu, Alan Medlar, Dorota Glowacka


Abstract
Detecting lexical semantic change in smaller data sets, e.g. in historical linguistics and digital humanities, is challenging due to a lack of statistical power. This issue is exacerbated by non-contextual embedding models that produce one embedding per word and, therefore, mask the variability present in the data. In this article, we propose an approach to estimate semantic shift by combining contextual word embeddings with permutation-based statistical tests. We use the false discovery rate procedure to address the large number of hypothesis tests being conducted simultaneously. We demonstrate the performance of this approach in simulation where it achieves consistently high precision by suppressing false positives. We additionally analyze real-world data from SemEval-2020 Task 1 and the Liverpool FC subreddit corpus. We show that by taking sample variation into account, we can improve the robustness of individual semantic shift estimates without degrading overall performance.
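The abstract describes the approach only at a high level. As a minimal, hedged illustration of the kind of pipeline it names (a permutation test on contextual embedding differences, followed by Benjamini-Hochberg false discovery rate correction), the sketch below assumes each target word's occurrence embeddings in the two corpora are already available as NumPy arrays. The function names (shift_statistic, permutation_pvalue, benjamini_hochberg) and the choice of cosine distance between mean embeddings as the test statistic are illustrative assumptions, not the authors' exact implementation.

# Illustrative sketch (not the authors' code): permutation test on the
# difference between a word's contextual embeddings in two corpora,
# with Benjamini-Hochberg FDR correction across many target words.
import numpy as np

def shift_statistic(emb_a, emb_b):
    # Cosine distance between the mean contextual embeddings of the
    # word's occurrences in corpus A and corpus B (an assumed statistic).
    mu_a, mu_b = emb_a.mean(axis=0), emb_b.mean(axis=0)
    cos = np.dot(mu_a, mu_b) / (np.linalg.norm(mu_a) * np.linalg.norm(mu_b))
    return 1.0 - cos

def permutation_pvalue(emb_a, emb_b, n_perm=1000, rng=None):
    # Pool all occurrence embeddings and repeatedly reassign them at random
    # to the two corpora; the p-value is the fraction of permutations whose
    # statistic is at least as large as the observed one.
    rng = rng or np.random.default_rng(0)
    observed = shift_statistic(emb_a, emb_b)
    pooled = np.vstack([emb_a, emb_b])
    n_a = len(emb_a)
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(len(pooled))
        if shift_statistic(pooled[perm[:n_a]], pooled[perm[n_a:]]) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)

def benjamini_hochberg(pvalues, alpha=0.05):
    # Benjamini-Hochberg procedure: reject the null for the largest k such
    # that p_(k) <= (k/m) * alpha, controlling the false discovery rate.
    p = np.asarray(pvalues)
    m = len(p)
    order = np.argsort(p)
    passed = p[order] <= alpha * (np.arange(1, m + 1) / m)
    rejected = np.zeros(m, dtype=bool)
    if passed.any():
        k = np.max(np.nonzero(passed)[0])
        rejected[order[:k + 1]] = True
    return rejected

Given one permutation p-value per target word, benjamini_hochberg returns a boolean mask of the words whose semantic shift is declared significant at the chosen false discovery rate, which is how the large number of simultaneous hypothesis tests mentioned in the abstract would be handled.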
Anthology ID:
2021.eval4nlp-1.11
Volume:
Proceedings of the 2nd Workshop on Evaluation and Comparison of NLP Systems
Month:
November
Year:
2021
Address:
Punta Cana, Dominican Republic
Editors:
Yang Gao, Steffen Eger, Wei Zhao, Piyawat Lertvittayakumjorn, Marina Fomicheva
Venue:
Eval4NLP
Publisher:
Association for Computational Linguistics
Pages:
104–113
URL:
https://aclanthology.org/2021.eval4nlp-1.11
DOI:
10.18653/v1/2021.eval4nlp-1.11
Cite (ACL):
Yang Liu, Alan Medlar, and Dorota Glowacka. 2021. Statistically Significant Detection of Semantic Shifts using Contextual Word Embeddings. In Proceedings of the 2nd Workshop on Evaluation and Comparison of NLP Systems, pages 104–113, Punta Cana, Dominican Republic. Association for Computational Linguistics.
Cite (Informal):
Statistically Significant Detection of Semantic Shifts using Contextual Word Embeddings (Liu et al., Eval4NLP 2021)
PDF:
https://aclanthology.org/2021.eval4nlp-1.11.pdf
Video:
https://aclanthology.org/2021.eval4nlp-1.11.mp4