Transformer based Natural Language Generation for Question-Answering

Imen Akermi, Johannes Heinecke, Frédéric Herledan


Abstract
This paper explores Natural Language Generation in the context of the Question-Answering task. Previous work addressing this task has focused on generating either a short answer or a long text span containing the answer, while reasoning over a Web page or processing structured data. Such answer lengths are usually not appropriate, as the answer tends to be perceived as either too brief or too long to be read aloud by an intelligent assistant. In this work, we aim to generate a concise answer for a given question using an unsupervised approach that does not require annotated data. Tested on English and French datasets, the proposed approach shows very promising results.
Anthology ID:
2020.inlg-1.41
Volume:
Proceedings of the 13th International Conference on Natural Language Generation
Month:
December
Year:
2020
Address:
Dublin, Ireland
Editors:
Brian Davis, Yvette Graham, John Kelleher, Yaji Sripada
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
349–359
URL:
https://aclanthology.org/2020.inlg-1.41
DOI:
10.18653/v1/2020.inlg-1.41
Cite (ACL):
Imen Akermi, Johannes Heinecke, and Frédéric Herledan. 2020. Transformer based Natural Language Generation for Question-Answering. In Proceedings of the 13th International Conference on Natural Language Generation, pages 349–359, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Transformer based Natural Language Generation for Question-Answering (Akermi et al., INLG 2020)
PDF:
https://aclanthology.org/2020.inlg-1.41.pdf
Data
Universal Dependencies