Learning Analogy-Preserving Sentence Embeddings for Answer Selection

Aïssatou Diallo, Markus Zopf, Johannes Fürnkranz


Abstract
Answer selection aims at identifying the correct answer for a given question from a set of potentially correct answers. Contrary to previous works, which typically focus on the semantic similarity between a question and its answer, our hypothesis is that question-answer pairs are often in analogical relation to each other. Using analogical inference as our use case, we propose a framework and a neural network architecture for learning dedicated sentence embeddings that preserve analogical properties in the semantic space. We evaluate the proposed method on benchmark datasets for answer selection and demonstrate that our sentence embeddings indeed capture analogical properties better than conventional embeddings, and that analogy-based question answering outperforms a comparable similarity-based technique.
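The abstract rests on the idea that a question–answer pair (q1, a1) stands in analogy to another pair (q2, a2) when the offset between question and answer embeddings is roughly the same in both pairs (the classic parallelogram view of analogy). A minimal sketch of that property, using hypothetical toy vectors rather than the paper's learned embeddings:

```python
import numpy as np

def cosine(u, v):
    """Cosine similarity between two vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

def analogy_score(q1, a1, q2, a2):
    # The analogy q1 : a1 :: q2 : a2 holds when the offset vectors are
    # parallel, i.e. a1 - q1 points in roughly the same direction as a2 - q2.
    return cosine(a1 - q1, a2 - q2)

# Toy 3-d embeddings (illustrative values, not from the paper).
q1 = np.array([1.0, 0.0, 0.0]); a1 = np.array([1.0, 1.0, 0.0])
q2 = np.array([0.0, 1.0, 0.0]); a2 = np.array([0.0, 2.0, 0.0])

print(analogy_score(q1, a1, q2, a2))  # both offsets are (0, 1, 0) -> 1.0
```

Under this view, a candidate answer for a new question can be scored by how well it completes an analogy with known question–answer pairs, rather than by its direct similarity to the question.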
Anthology ID:
K19-1085
Volume:
Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL)
Month:
November
Year:
2019
Address:
Hong Kong, China
Editors:
Mohit Bansal, Aline Villavicencio
Venue:
CoNLL
SIG:
SIGNLL
Publisher:
Association for Computational Linguistics
Pages:
910–919
URL:
https://aclanthology.org/K19-1085
DOI:
10.18653/v1/K19-1085
Cite (ACL):
Aïssatou Diallo, Markus Zopf, and Johannes Fürnkranz. 2019. Learning Analogy-Preserving Sentence Embeddings for Answer Selection. In Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL), pages 910–919, Hong Kong, China. Association for Computational Linguistics.
Cite (Informal):
Learning Analogy-Preserving Sentence Embeddings for Answer Selection (Diallo et al., CoNLL 2019)
PDF:
https://aclanthology.org/K19-1085.pdf
Data
WikiQA