Leveraging Large Pretrained Models for WebNLG 2020

Xintong Li, Aleksandre Maskharashvili, Symon Jory Stevens-Guille, Michael White


Abstract
In this paper, we report experiments on finetuning large pretrained models to realize Resource Description Framework (RDF) triples in natural language. We provide the details of how we built one of the top-ranked English generation models in the WebNLG Challenge 2020. We also show that there appears to be considerable potential for reranking to improve on the current state of the art, in terms of both statistical and model-based metrics. Our human analyses of the generated texts show that for Russian, pretrained models had some success, in terms of both lexical and morpho-syntactic choices as well as content aggregation. Nevertheless, in a number of cases the models were unpredictable, both in their failures and in their successes. Omission of content and hallucination, which in many cases occurred together, were the major problems. By contrast, the models for English showed near-perfect performance on the validation set.
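The abstract's recipe, linearizing RDF triples into a flat string and letting a finetuned pretrained seq2seq model verbalize them, can be illustrated with a minimal sketch. The sketch below assumes a HuggingFace T5 model; the model choice, the linearization format with <S>/<P>/<O> markers, and the decoding settings are illustrative assumptions, not the authors' exact configuration (see the linked repository for that).

```python
# Hypothetical sketch: RDF-to-text with a pretrained seq2seq model.
# Assumes HuggingFace Transformers with a T5 checkpoint; in practice one
# would finetune the checkpoint on WebNLG (triples -> reference texts) first.
from transformers import T5Tokenizer, T5ForConditionalGeneration

def linearize(triples):
    """Flatten (subject, predicate, object) triples into one input string."""
    return " ".join(f"<S> {s} <P> {p} <O> {o}" for s, p, o in triples)

tokenizer = T5Tokenizer.from_pretrained("t5-base")
model = T5ForConditionalGeneration.from_pretrained("t5-base")

triples = [("Alan_Bean", "occupation", "Test_pilot")]
inputs = tokenizer(linearize(triples), return_tensors="pt")

# Beam search yields several candidates; a reranker scoring them against a
# statistical or model-based metric could then reorder them, as the abstract
# suggests. The beam size here is arbitrary.
outputs = model.generate(**inputs, num_beams=4, num_return_sequences=4,
                         max_length=64)
for o in outputs:
    print(tokenizer.decode(o, skip_special_tokens=True))
```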
Anthology ID:
2020.webnlg-1.12
Volume:
Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+)
Month:
December
Year:
2020
Address:
Dublin, Ireland (Virtual)
Editors:
Thiago Castro Ferreira, Claire Gardent, Nikolai Ilinykh, Chris van der Lee, Simon Mille, Diego Moussallem, Anastasia Shimorina
Venue:
WebNLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
117–124
URL:
https://aclanthology.org/2020.webnlg-1.12
Cite (ACL):
Xintong Li, Aleksandre Maskharashvili, Symon Jory Stevens-Guille, and Michael White. 2020. Leveraging Large Pretrained Models for WebNLG 2020. In Proceedings of the 3rd International Workshop on Natural Language Generation from the Semantic Web (WebNLG+), pages 117–124, Dublin, Ireland (Virtual). Association for Computational Linguistics.
Cite (Informal):
Leveraging Large Pretrained Models for WebNLG 2020 (Li et al., WebNLG 2020)
PDF:
https://aclanthology.org/2020.webnlg-1.12.pdf
Code:
znculee/webnlg2020
Data:
DBpedia, WebNLG