Xuan Wu


2022

Logical Form Generation via Multi-task Learning for Complex Question Answering over Knowledge Bases
Xixin Hu | Xuan Wu | Yiheng Shu | Yuzhong Qu
Proceedings of the 29th International Conference on Computational Linguistics

Question answering over knowledge bases (KBQA) for complex questions is a challenging task in natural language processing. Recently, generation-based methods that translate natural language questions into executable logical forms have achieved promising performance. These methods use auxiliary information to augment the logical form generation for questions with unseen KB items or novel combinations, but the introduced noise can also lead to more incorrect results. In this work, we propose GMT-KBQA, a Generation-based KBQA method via Multi-Task learning, to better retrieve and utilize auxiliary information. GMT-KBQA first obtains candidate entities and relations through dense retrieval, and then introduces a multi-task model which jointly learns entity disambiguation, relation classification, and logical form generation. Experimental results show that GMT-KBQA achieves state-of-the-art results on both the ComplexWebQuestions and WebQuestionsSP datasets. Furthermore, the detailed evaluation demonstrates that GMT-KBQA benefits from the auxiliary tasks and has a strong generalization capability.
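
The joint setup described in the abstract can be pictured as a shared encoder feeding three objectives: entity disambiguation, relation classification, and logical form generation. The sketch below is not the authors' implementation; the model class, heads, dimensions, and equal loss weights are all illustrative assumptions intended only to show how such a multi-task objective is typically wired up.

    # Minimal multi-task sketch (assumptions throughout): a shared Transformer
    # encoder over the question plus retrieved candidates, with a generation
    # decoder and two classification heads trained under a summed joint loss.
    import torch
    import torch.nn as nn

    class MultiTaskKBQA(nn.Module):
        def __init__(self, vocab_size=32000, d_model=256, n_relations=500):
            super().__init__()
            self.embed = nn.Embedding(vocab_size, d_model)
            enc = nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True)
            self.encoder = nn.TransformerEncoder(enc, num_layers=2)
            dec = nn.TransformerDecoderLayer(d_model, nhead=4, batch_first=True)
            self.decoder = nn.TransformerDecoder(dec, num_layers=2)
            self.lm_head = nn.Linear(d_model, vocab_size)    # logical form generation
            self.rel_head = nn.Linear(d_model, n_relations)  # relation classification
            self.ent_head = nn.Linear(d_model, 2)            # entity disambiguation (keep/drop)

        def forward(self, src_ids, tgt_ids):
            memory = self.encoder(self.embed(src_ids))
            pooled = memory.mean(dim=1)                      # simple pooled representation
            dec_out = self.decoder(self.embed(tgt_ids), memory)
            return self.lm_head(dec_out), self.rel_head(pooled), self.ent_head(pooled)

    model = MultiTaskKBQA()
    src = torch.randint(0, 32000, (2, 16))   # question + candidate entities/relations
    tgt = torch.randint(0, 32000, (2, 12))   # target logical form tokens
    lf_logits, rel_logits, ent_logits = model(src, tgt)

    # Joint objective: the three losses are summed with equal weights
    # (the weighting scheme here is an assumption, not taken from the paper).
    loss = (
        nn.functional.cross_entropy(lf_logits.reshape(-1, 32000), tgt.reshape(-1))
        + nn.functional.cross_entropy(rel_logits, torch.randint(0, 500, (2,)))
        + nn.functional.cross_entropy(ent_logits, torch.randint(0, 2, (2,)))
    )
    loss.backward()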

2016

Enhanced Personalized Search using Social Data
Dong Zhou | Séamus Lawless | Xuan Wu | Wenyu Zhao | Jianxun Liu
Proceedings of the 2016 Conference on Empirical Methods in Natural Language Processing