Solving Hard Analogy Questions with Relation Embedding Chains

Nitesh Kumar, Steven Schockaert


Abstract
Modelling how concepts are related is a central topic in Lexical Semantics. A common strategy is to rely on knowledge graphs (KGs) such as ConceptNet, and to model the relation between two concepts as a set of paths. However, KGs are limited to a fixed set of relation types, and they are incomplete and often noisy. Another strategy is to distill relation embeddings from a fine-tuned language model. However, this is less suitable for words that are only indirectly related and it does not readily allow us to incorporate structured domain knowledge. In this paper, we aim to combine the best of both worlds. We model relations as paths but associate their edges with relation embeddings. The paths are obtained by first identifying suitable intermediate words and then selecting those words for which informative relation embeddings can be obtained. We empirically show that our proposed representations are useful for solving hard analogy questions.
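The abstract describes representing the relation between two words as chains of relation embeddings through intermediate words, keeping only intermediates for which informative embeddings can be obtained. The sketch below is a minimal, hypothetical illustration of that idea in Python; the function names, the random placeholder embeddings, and the informativeness score are assumptions made for illustration, not the authors' implementation (which distils relation embeddings from a fine-tuned language model and draws candidate intermediate words from knowledge-graph paths).

```python
import numpy as np

def relation_embedding(word_a, word_b, dim=128):
    """Placeholder relation embedding for the pair (word_a, word_b).
    A real system would query a fine-tuned language model here; this
    stub only returns a deterministic random unit vector."""
    rng = np.random.default_rng(abs(hash((word_a, word_b))) % (2**32))
    vec = rng.normal(size=dim)
    return vec / np.linalg.norm(vec)

def informativeness(rel_vec, prototype_vecs):
    """Score how informative a relation embedding is, here (as an assumption)
    by its maximum cosine similarity to a set of prototype relation vectors."""
    return max(float(rel_vec @ p) for p in prototype_vecs)

def relation_chains(word_a, word_b, candidates, prototypes, top_k=3):
    """Represent the relation between word_a and word_b as chains of relation
    embeddings through the most informative intermediate words."""
    scored = []
    for mid in candidates:
        r1 = relation_embedding(word_a, mid)   # edge word_a -> mid
        r2 = relation_embedding(mid, word_b)   # edge mid -> word_b
        # keep the chain only as informative as its weakest edge
        score = min(informativeness(r1, prototypes),
                    informativeness(r2, prototypes))
        scored.append((score, mid, (r1, r2)))
    scored.sort(key=lambda t: t[0], reverse=True)
    return [(mid, chain) for _, mid, chain in scored[:top_k]]

if __name__ == "__main__":
    # Illustrative prototypes and candidates only; not from the paper.
    prototypes = [relation_embedding("paris", "france"),
                  relation_embedding("wheel", "car")]
    chains = relation_chains("sunscreen", "sunburn",
                             ["skin", "sun", "lotion"], prototypes)
    for mid, _ in chains:
        print("intermediate word:", mid)
```

Two word pairs could then be compared for an analogy question by matching their chains, e.g. by the similarity of the corresponding edge embeddings; the paper itself should be consulted for the actual scoring procedure.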
Anthology ID:
2023.emnlp-main.382
Volume:
Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing
Month:
December
Year:
2023
Address:
Singapore
Editors:
Houda Bouamor, Juan Pino, Kalika Bali
Venue:
EMNLP
Publisher:
Association for Computational Linguistics
Pages:
6224–6236
URL:
https://aclanthology.org/2023.emnlp-main.382
DOI:
10.18653/v1/2023.emnlp-main.382
Cite (ACL):
Nitesh Kumar and Steven Schockaert. 2023. Solving Hard Analogy Questions with Relation Embedding Chains. In Proceedings of the 2023 Conference on Empirical Methods in Natural Language Processing, pages 6224–6236, Singapore. Association for Computational Linguistics.
Cite (Informal):
Solving Hard Analogy Questions with Relation Embedding Chains (Kumar & Schockaert, EMNLP 2023)
PDF:
https://aclanthology.org/2023.emnlp-main.382.pdf
Video:
https://aclanthology.org/2023.emnlp-main.382.mp4