Simple Question Answering by Attentive Convolutional Neural Network

Wenpeng Yin, Mo Yu, Bing Xiang, Bowen Zhou, Hinrich Schütze


Abstract
This work focuses on answering single-relation factoid questions over Freebase. Each question can be answered by a single fact of the form (subject, predicate, object) in Freebase. This task, simple question answering (SimpleQA), can be addressed via a two-step pipeline: entity linking and fact selection. In fact selection, we match the subject entity in a fact candidate with the entity mention in the question by a character-level convolutional neural network (char-CNN), and match the predicate in that fact with the question by a word-level CNN (word-CNN). This work makes two main contributions. (i) A simple and effective entity linker over Freebase is proposed. Our entity linker outperforms the state-of-the-art entity linker on the SimpleQA task. (ii) A novel attentive maxpooling is stacked over the word-CNN, so that the predicate representation can be matched with the predicate-focused question representation more effectively. Experiments show that our system sets a new state of the art on this task.
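To illustrate the fact-selection step, the following is a minimal NumPy sketch of attention-guided max-pooling over word-CNN feature maps. All names (`conv1d`, `attentive_maxpool`) and details such as the tanh nonlinearity and softmax attention are illustrative assumptions, not the paper's exact formulation: each convolutional feature-map position over the question is scored by its similarity to the predicate representation, reweighted, and then max-pooled.

```python
import numpy as np

def conv1d(X, W, b):
    """Simple valid 1-D convolution over word embeddings (illustrative).

    X: (n_words, d_in) question word embeddings
    W: (k, d_in, d_out) filter bank with width k
    b: (d_out,) bias
    Returns H: (n_words - k + 1, d_out) feature maps.
    """
    k, d_in, d_out = W.shape
    n = X.shape[0] - k + 1
    H = np.empty((n, d_out))
    for i in range(n):
        # contract the window against the filter bank, then apply tanh
        H[i] = np.tanh(np.tensordot(X[i:i + k], W, axes=([0, 1], [0, 1])) + b)
    return H

def attentive_maxpool(H, p):
    """Pool question feature maps, guided by the predicate representation.

    H: (n, d) conv feature maps over question positions
    p: (d,) predicate representation
    Scores each position by cosine similarity with the predicate,
    converts scores to softmax weights, reweights the feature maps,
    and max-pools over positions to get a predicate-focused vector.
    """
    sims = H @ p / (np.linalg.norm(H, axis=1) * np.linalg.norm(p) + 1e-8)
    weights = np.exp(sims - sims.max())
    weights /= weights.sum()
    return (weights[:, None] * H).max(axis=0)
```

In a full system, the pooled vector would be compared (e.g. by cosine similarity) against each candidate fact's predicate representation to rank the facts.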
Anthology ID:
C16-1164
Volume:
Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers
Month:
December
Year:
2016
Address:
Osaka, Japan
Venue:
COLING
Publisher:
The COLING 2016 Organizing Committee
Pages:
1746–1756
URL:
https://aclanthology.org/C16-1164
Cite (ACL):
Wenpeng Yin, Mo Yu, Bing Xiang, Bowen Zhou, and Hinrich Schütze. 2016. Simple Question Answering by Attentive Convolutional Neural Network. In Proceedings of COLING 2016, the 26th International Conference on Computational Linguistics: Technical Papers, pages 1746–1756, Osaka, Japan. The COLING 2016 Organizing Committee.
Cite (Informal):
Simple Question Answering by Attentive Convolutional Neural Network (Yin et al., COLING 2016)
PDF:
https://aclanthology.org/C16-1164.pdf
Data
Paralex
SimpleQuestions