Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity

Hamza Harkous, Isabel Groves, Amir Saffari


Abstract
End-to-end neural data-to-text (D2T) generation has recently emerged as an alternative to pipeline-based architectures. However, it has faced challenges generalizing to new domains and generating semantically consistent text. In this work, we present DataTuner, a neural, end-to-end data-to-text generation system that makes minimal assumptions about the data representation and target domain. We take a two-stage generation-reranking approach, combining a fine-tuned language model with a semantic fidelity classifier. Each component is learnt end-to-end without needing dataset-specific heuristics, entity delexicalization, or post-processing. We show that DataTuner achieves state-of-the-art results on automated metrics across four major D2T datasets (LDC2017T10, WebNLG, ViGGO, and Cleaned E2E), with fluency assessed by human annotators as nearing or exceeding that of the human-written reference texts. Our generated text has better semantic fidelity than the state of the art on these datasets. We further demonstrate that our model-based semantic fidelity scorer is a better assessment tool than traditional heuristic-based measures of semantic accuracy.
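The two-stage generate-then-rerank approach described in the abstract can be sketched in outline. The candidate generator and fidelity scorer below are hypothetical stand-ins for DataTuner's fine-tuned language model and learned semantic fidelity classifier; the slot-matching heuristic is only illustrative, not the paper's method:

```python
# Minimal sketch of a generate-then-rerank pipeline.
# Both functions are illustrative stand-ins, not DataTuner's actual components.

def generate_candidates(data):
    # Stand-in for sampling several outputs from a fine-tuned language model.
    name, food = data["name"], data["food"]
    return [
        f"{name} serves {food} food.",
        f"{name} is a restaurant.",       # this candidate drops the 'food' slot
        f"{name} offers {food} cuisine.",
    ]

def semantic_fidelity_score(data, text):
    # Stand-in for a learned classifier: here, the fraction of input
    # values that appear verbatim in the generated text.
    return sum(v.lower() in text.lower() for v in data.values()) / len(data)

def generate_and_rerank(data):
    # Stage 1: generate candidates; Stage 2: keep the most faithful one.
    candidates = generate_candidates(data)
    return max(candidates, key=lambda t: semantic_fidelity_score(data, t))

print(generate_and_rerank({"name": "Aromi", "food": "Italian"}))
```

Ties in the fidelity score are broken by candidate order, so among equally faithful candidates the first generated (typically the most probable under the language model) wins.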
Anthology ID:
2020.coling-main.218
Volume:
Proceedings of the 28th International Conference on Computational Linguistics
Month:
December
Year:
2020
Address:
Barcelona, Spain (Online)
Editors:
Donia Scott, Nuria Bel, Chengqing Zong
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
2410–2424
URL:
https://aclanthology.org/2020.coling-main.218
DOI:
10.18653/v1/2020.coling-main.218
Bibkey:
Cite (ACL):
Hamza Harkous, Isabel Groves, and Amir Saffari. 2020. Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity. In Proceedings of the 28th International Conference on Computational Linguistics, pages 2410–2424, Barcelona, Spain (Online). International Committee on Computational Linguistics.
Cite (Informal):
Have Your Text and Use It Too! End-to-End Neural Data-to-Text Generation with Semantic Fidelity (Harkous et al., COLING 2020)
PDF:
https://aclanthology.org/2020.coling-main.218.pdf
Code
 amazon-research/datatuner
Data
E2E, LDC2017T10, ViGGO, WebNLG