Wenqiang Zhang
2021
Improving Zero-Shot Cross-lingual Transfer for Multilingual Question Answering over Knowledge Graph
Yucheng Zhou | Xiubo Geng | Tao Shen | Wenqiang Zhang | Daxin Jiang
Proceedings of the 2021 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies
Multilingual question answering over knowledge graph (KGQA) aims to derive answers from a knowledge graph (KG) for questions in multiple languages. To be widely applicable, we focus on its zero-shot transfer setting: we can only access training data in a high-resource language, yet must answer multilingual questions without any labeled data in the target languages. A straightforward approach is to resort to pre-trained multilingual models (e.g., mBERT) for cross-lingual transfer, but a significant gap in KGQA performance remains between source and target languages. In this paper, we exploit unsupervised bilingual lexicon induction (BLI) to map training questions in the source language into their counterparts in the target language as augmented training data, which circumvents the language inconsistency between training and inference. Furthermore, we propose an adversarial learning strategy to alleviate the syntax disorder of the augmented data, encouraging the model toward both language- and syntax-independence. Consequently, our model narrows the gap in zero-shot cross-lingual transfer. Experiments on two multilingual KGQA datasets with 11 zero-resource languages verify its effectiveness.
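The BLI-based augmentation step lends itself to a short illustration. Below is a minimal Python sketch, not the authors' implementation: it assumes a tab-separated source-to-target lexicon produced by unsupervised BLI (e.g., a MUSE- or VecMap-style induced dictionary) and performs word-by-word substitution. Because substitution preserves source-language word order, the output exhibits exactly the syntax disorder that the adversarial objective is meant to absorb. All function names here are hypothetical.

```python
from typing import Dict, List


def load_bli_lexicon(path: str) -> Dict[str, str]:
    """Load a source->target word lexicon induced by unsupervised BLI.
    Assumed format (hypothetical): one tab-separated pair per line,
    best translations first."""
    lexicon: Dict[str, str] = {}
    with open(path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip("\n").split("\t")
            if len(parts) != 2:
                continue  # skip malformed lines
            src, tgt = parts
            lexicon.setdefault(src, tgt)  # keep the top-ranked translation
    return lexicon


def augment_question(question: str, lexicon: Dict[str, str]) -> str:
    """Map a source-language question into the target language word by word.
    Out-of-lexicon words are kept as-is, so the output follows source-language
    word order -- the syntax disorder the adversarial training targets."""
    return " ".join(lexicon.get(tok, tok) for tok in question.split())


def build_augmented_set(questions: List[str], lexicon_path: str) -> List[str]:
    """Produce augmented target-language training questions from source data."""
    lexicon = load_bli_lexicon(lexicon_path)
    return [augment_question(q, lexicon) for q in questions]
```

For example, a lexicon entry mapping "capital" and "France" to their target-language equivalents would turn "What is the capital of France" into a token-substituted question in the target language while leaving English word order intact.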
Modeling Event-Pair Relations in External Knowledge Graphs for Script Reasoning
Yucheng Zhou | Xiubo Geng | Tao Shen | Jian Pei | Wenqiang Zhang | Daxin Jiang
Findings of the Association for Computational Linguistics: ACL-IJCNLP 2021