A Systematic Investigation of KB-Text Embedding Alignment at Scale
Vardaan Pahuja | Yu Gu | Wenhu Chen | Mehdi Bahrami | Lei Liu | Wei-Peng Chen | Yu Su
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), 2021
Knowledge bases (KBs) and text often contain complementary knowledge: KBs store structured knowledge that can support long-range reasoning, while text stores more comprehensive and timely knowledge in an unstructured way. Separately embedding the individual knowledge sources into vector spaces has demonstrated tremendous success in encoding the respective knowledge, but how to jointly embed and reason with both knowledge sources to fully leverage the complementary information is still largely an open problem. We conduct a large-scale, systematic investigation of aligning KB and text embeddings for joint reasoning. We set up a novel evaluation framework with two evaluation tasks, few-shot link prediction and analogical reasoning, and evaluate an array of KB-text embedding alignment methods. We also demonstrate how such alignment can infuse textual information into KB embeddings for more accurate link prediction on emerging entities and events, using COVID-19 as a case study.
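The paper evaluates an array of alignment methods rather than prescribing one, but the core idea can be illustrated with a minimal sketch: learn a projection from the KB embedding space into the text embedding space using anchor entities present in both sources, then use the projected vectors for cross-space lookup (e.g., for entities that so far exist only on the text side). The setup below is hypothetical — the toy data, dimensions, and the least-squares projection are illustrative assumptions, not the paper's method.

```python
import numpy as np

# Hypothetical sketch of projection-based KB-text embedding alignment.
# Anchor entities are assumed to have an embedding in both spaces.
rng = np.random.default_rng(0)

n_anchors, kb_dim, text_dim = 1000, 100, 300
E_kb = rng.normal(size=(n_anchors, kb_dim))      # KB embeddings of anchor entities (toy data)
E_text = rng.normal(size=(n_anchors, text_dim))  # text embeddings of the same entities (toy data)

# Learn a linear map W as the least-squares solution to
#   min_W || E_kb @ W - E_text ||_F^2
W, *_ = np.linalg.lstsq(E_kb, E_text, rcond=None)

# Project a KB embedding into the text space and retrieve its nearest
# text-side neighbor by cosine similarity.
query = E_kb[0] @ W
sims = (E_text @ query) / (np.linalg.norm(E_text, axis=1) * np.linalg.norm(query))
print("nearest text-side entity index:", int(np.argmax(sims)))
```

More expressive alignment methods (e.g., shared training objectives rather than a post-hoc linear map) follow the same pattern: establish correspondences between the two spaces so that reasoning can move across them.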