How Can Cross-lingual Knowledge Contribute Better to Fine-Grained Entity Typing?
Hailong Jin | Tiansi Dong | Lei Hou | Juanzi Li | Hui Chen | Zelin Dai | Qu Yincen
Findings of the Association for Computational Linguistics: ACL 2022
Cross-lingual Entity Typing (CLET) aims at improving the quality of entity type prediction by transferring semantic knowledge learned from high-resource languages to low-resource languages. In this paper, using multilingual transfer learning via a mixture-of-experts approach, our model dynamically captures the relationship between the target language and each source language, and generalizes effectively to predict types of unseen entities in new languages. Extensive experiments on multilingual datasets show that our method significantly outperforms multiple baselines and robustly handles negative transfer. We question the relationship between language similarity and CLET performance: a series of experiments refutes the common assumption that more source languages are always better, and suggests the Similarity Hypothesis for CLET.
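The abstract does not spell out the architecture, but the core idea of gating per-source-language experts can be illustrated with a minimal sketch. The sketch below is an assumption, not the paper's implementation: class and parameter names (`MoECrossLingualTyper`, `num_sources`, `hidden_dim`, `num_types`) are hypothetical, and it simply combines one type classifier per source language with a learned gate over a multilingual mention representation.

```python
# Hypothetical sketch of mixture-of-experts cross-lingual typing,
# NOT the authors' released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoECrossLingualTyper(nn.Module):
    """One expert per source language; a gate weights experts per mention."""

    def __init__(self, num_sources: int, hidden_dim: int, num_types: int):
        super().__init__()
        # One type classifier ("expert") per source language.
        self.experts = nn.ModuleList(
            nn.Linear(hidden_dim, num_types) for _ in range(num_sources)
        )
        # Gate scores how relevant each source language is to this mention.
        self.gate = nn.Linear(hidden_dim, num_sources)

    def forward(self, mention_repr: torch.Tensor) -> torch.Tensor:
        # mention_repr: (batch, hidden_dim) output of a multilingual encoder.
        weights = F.softmax(self.gate(mention_repr), dim=-1)   # (batch, S)
        logits = torch.stack(
            [expert(mention_repr) for expert in self.experts], dim=1
        )                                                       # (batch, S, T)
        # Convex combination of expert predictions, weighted by the gate.
        return (weights.unsqueeze(-1) * logits).sum(dim=1)     # (batch, T)

# Usage sketch: 3 source languages, 768-dim mention vectors, 50 entity types.
model = MoECrossLingualTyper(num_sources=3, hidden_dim=768, num_types=50)
scores = model(torch.randn(4, 768))  # per-type logits for 4 mentions
```

Because the gate is computed per mention, a source language that would cause negative transfer for a given target mention can receive near-zero weight, which is one plausible reading of how such a model "robustly handles negative transfer."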