Evaluating Bilingual Lexicon Induction without Lexical Data

Michaela Denisová, Pavel Rychly


Abstract
Bilingual Lexicon Induction (BLI) is a fundamental task in cross-lingual word embedding (CWE) evaluation, aimed at retrieving word translations from monolingual corpora in two languages. Despite the task’s central role, existing evaluation datasets based on lexical data often contain biases such as a lack of morphological diversity, frequency skew, semantic leakage, and overrepresentation of proper names, which undermine the validity of reported performance. In this paper, we propose a novel, language-agnostic evaluation methodology that entirely eliminates the dependence on lexical data. By training two sets of monolingual word embeddings (MWEs) on identical data with identical algorithms but different weight initialisations, we enable assessment on the BLI task without being affected by the quality of an evaluation dataset. We evaluate three baseline CWE models and analyse the impact of key hyperparameters. Our results provide a more reliable, bias-free perspective on CWE models’ performance.
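
To make the setup concrete, the following is a minimal sketch of the evaluation idea described in the abstract, not the authors’ released code: two word2vec models are trained on the same corpus with identical hyperparameters but different random seeds, one space is mapped onto the other with an orthogonal Procrustes baseline, and precision@1 is scored by checking whether each word retrieves itself (since both spaces come from the same corpus, the gold “translation” of a word is the word itself, so no lexical data is needed). The corpus path, gensim hyperparameters, and the choice of Procrustes as the mapping are illustrative assumptions.

import numpy as np
from gensim.models import Word2Vec

# Hypothetical corpus file: one tokenised sentence per line.
sentences = [line.split() for line in open("corpus.txt", encoding="utf-8")]

# Identical data and hyperparameters; only the seed (weight initialisation) differs.
src = Word2Vec(sentences, vector_size=100, min_count=5, workers=1, seed=1)
tgt = Word2Vec(sentences, vector_size=100, min_count=5, workers=1, seed=2)

# Shared vocabulary (identical corpora, so this is simply the model vocabulary).
vocab = [w for w in src.wv.index_to_key if w in tgt.wv.key_to_index]
X = np.stack([src.wv[w] for w in vocab])  # plays the role of the source language
Y = np.stack([tgt.wv[w] for w in vocab])  # plays the role of the target language

# Orthogonal Procrustes mapping (a standard supervised CWE baseline).
U, _, Vt = np.linalg.svd(X.T @ Y)
Xm = X @ (U @ Vt)

# Precision@1: a query is correct iff its nearest neighbour is the word itself.
Xm /= np.linalg.norm(Xm, axis=1, keepdims=True)
Yn = Y / np.linalg.norm(Y, axis=1, keepdims=True)
hits = sum(vocab[int(np.argmax(Yn @ Xm[i]))] == w for i, w in enumerate(vocab))
print(f"P@1 = {hits / len(vocab):.3f}")

Because both embedding spaces encode the same words, any drop below perfect precision reflects the CWE model and its hyperparameters rather than gaps or biases in a bilingual dictionary.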
Anthology ID:
2025.ranlp-1.34
Volume:
Proceedings of the 15th International Conference on Recent Advances in Natural Language Processing - Natural Language Processing in the Generative AI Era
Month:
September
Year:
2025
Address:
Varna, Bulgaria
Editors:
Galia Angelova, Maria Kunilovskaya, Marie Escribe, Ruslan Mitkov
Venue:
RANLP
Publisher:
INCOMA Ltd., Shoumen, Bulgaria
Pages:
275–282
URL:
https://aclanthology.org/2025.ranlp-1.34/
Cite (ACL):
Michaela Denisová and Pavel Rychly. 2025. Evaluating Bilingual Lexicon Induction without Lexical Data. In Proceedings of the 15th International Conference on Recent Advances in Natural Language Processing - Natural Language Processing in the Generative AI Era, pages 275–282, Varna, Bulgaria. INCOMA Ltd., Shoumen, Bulgaria.
Cite (Informal):
Evaluating Bilingual Lexicon Induction without Lexical Data (Denisová & Rychly, RANLP 2025)
PDF:
https://aclanthology.org/2025.ranlp-1.34.pdf