Studying the Impact of Filling Information Gaps on the Output Quality of Neural Data-to-Text

Craig Thomson, Zhijie Zhao, Somayajulu Sripada


Abstract
It is unfair to expect neural data-to-text systems to produce high quality output when there are gaps between the system input data and the information contained in the training text. Thomson et al. (2020) identify and narrow information gaps in Rotowire, a popular data-to-text dataset. In this paper, we describe a study which finds that a state-of-the-art neural data-to-text system produces higher quality output, according to information extraction (IE) based metrics, when additional input data is carefully selected from this newly available source. It remains to be shown, however, whether the IE metrics used in this study correlate well with human judgements of text quality.
Anthology ID:
2020.inlg-1.6
Volume:
Proceedings of the 13th International Conference on Natural Language Generation
Month:
December
Year:
2020
Address:
Dublin, Ireland
Editors:
Brian Davis, Yvette Graham, John Kelleher, Yaji Sripada
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
35–40
URL:
https://aclanthology.org/2020.inlg-1.6
DOI:
10.18653/v1/2020.inlg-1.6
Cite (ACL):
Craig Thomson, Zhijie Zhao, and Somayajulu Sripada. 2020. Studying the Impact of Filling Information Gaps on the Output Quality of Neural Data-to-Text. In Proceedings of the 13th International Conference on Natural Language Generation, pages 35–40, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Studying the Impact of Filling Information Gaps on the Output Quality of Neural Data-to-Text (Thomson et al., INLG 2020)
PDF:
https://aclanthology.org/2020.inlg-1.6.pdf
Code:
nlgcat/adding_data
Data:
RotoWireSportSett