%0 Conference Proceedings
%T An Empirical Study on Multiple Information Sources for Zero-Shot Fine-Grained Entity Typing
%A Chen, Yi
%A Jiang, Haiyun
%A Liu, Lemao
%A Shi, Shuming
%A Fan, Chuang
%A Yang, Min
%A Xu, Ruifeng
%Y Moens, Marie-Francine
%Y Huang, Xuanjing
%Y Specia, Lucia
%Y Yih, Scott Wen-tau
%S Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
%D 2021
%8 November
%I Association for Computational Linguistics
%C Online and Punta Cana, Dominican Republic
%F chen-etal-2021-empirical
%X Auxiliary information from multiple sources has been shown to be effective in zero-shot fine-grained entity typing (ZFET). However, a comprehensive understanding of how to make better use of the existing information sources, and of how they affect ZFET performance, is still lacking. In this paper, we empirically study three kinds of auxiliary information: context consistency, type hierarchy, and background knowledge (e.g., prototypes and descriptions) of types, and propose a multi-source fusion model (MSF) targeting these sources. The model achieves absolute gains of up to 11.42% and 22.84% in macro F1 over state-of-the-art baselines on BBN and Wiki, respectively. More importantly, we further discuss the characteristics, merits, and demerits of each information source and provide an intuitive understanding of the complementarity among them.
%R 10.18653/v1/2021.emnlp-main.210
%U https://aclanthology.org/2021.emnlp-main.210
%U https://doi.org/10.18653/v1/2021.emnlp-main.210
%P 2668-2678