Language Models and Semantic Relations: A Dual Relationship

Olivier Ferret


Abstract
Since they rely on the distributional hypothesis, static and contextual language models are closely linked to lexical semantic relations. In this paper, we exploit this link to enhance a BERT model. More precisely, we propose to extract lexical semantic relations with two unsupervised methods, one based on a static language model and the other on a contextual model, and to inject the extracted relations into a BERT model to improve its semantic capabilities. Through various evaluations performed for English and focusing on semantic similarity at the word and sentence levels, we demonstrate the benefits of this approach, which semantically enriches a BERT model without using any external semantic resource.
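
As a purely illustrative sketch (not the paper's actual extraction procedure), the snippet below shows the general principle the abstract alludes to: under the distributional hypothesis, nearest neighbors in a static embedding space can serve as candidate lexical semantic relations. The function name, the toy vocabulary, and the random vectors standing in for pre-trained embeddings are all hypothetical.

import numpy as np

def nearest_neighbor_relations(vocab, embeddings, top_k=5):
    """Return (word, neighbor, cosine similarity) triples as candidate
    lexical semantic relations extracted from a static embedding space."""
    # L2-normalize so that dot products equal cosine similarities.
    norms = np.linalg.norm(embeddings, axis=1, keepdims=True)
    unit = embeddings / np.clip(norms, 1e-12, None)
    sims = unit @ unit.T
    np.fill_diagonal(sims, -np.inf)  # exclude each word itself
    relations = []
    for i, word in enumerate(vocab):
        for j in np.argsort(-sims[i])[:top_k]:
            relations.append((word, vocab[j], float(sims[i, j])))
    return relations

# Toy usage with random vectors in place of real pre-trained embeddings.
rng = np.random.default_rng(0)
toy_vocab = ["car", "automobile", "vehicle", "banana"]
toy_vectors = rng.normal(size=(len(toy_vocab), 50))
for w, n, s in nearest_neighbor_relations(toy_vocab, toy_vectors, top_k=1):
    print(f"{w} -> {n} ({s:.2f})")

The same principle applies to a contextual model, except that word representations would first be obtained by averaging contextual token embeddings over occurrences in a corpus; how the extracted relations are then injected into BERT is the subject of the paper itself.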
Anthology ID:
2024.lrec-main.878
Volume:
Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)
Month:
May
Year:
2024
Address:
Torino, Italia
Editors:
Nicoletta Calzolari, Min-Yen Kan, Veronique Hoste, Alessandro Lenci, Sakriani Sakti, Nianwen Xue
Venues:
LREC | COLING
Publisher:
ELRA and ICCL
Pages:
10046–10057
URL:
https://aclanthology.org/2024.lrec-main.878
Cite (ACL):
Olivier Ferret. 2024. Language Models and Semantic Relations: A Dual Relationship. In Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), pages 10046–10057, Torino, Italia. ELRA and ICCL.
Cite (Informal):
Language Models and Semantic Relations: A Dual Relationship (Ferret, LREC-COLING 2024)
PDF:
https://aclanthology.org/2024.lrec-main.878.pdf