AggGen: Ordering and Aggregating while Generating

Xinnuo Xu, Ondřej Dušek, Verena Rieser, Ioannis Konstas


Abstract
We present AggGen (pronounced ‘again’), a data-to-text model which re-introduces two explicit sentence planning stages into neural data-to-text systems: input ordering and input aggregation. In contrast to previous work using sentence planning, our model is still end-to-end: AggGen performs sentence planning at the same time as generating text by learning latent alignments (via semantic facts) between the input representation and the target text. Experiments on the WebNLG and E2E challenge data show that by using fact-based alignments our approach is more interpretable, expressive, robust to noise, and easier to control, while retaining the advantages of end-to-end systems in terms of fluency. Our code is available at https://github.com/XinnuoXu/AggGen.
Anthology ID:
2021.acl-long.113
Volume:
Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers)
Month:
August
Year:
2021
Address:
Online
Editors:
Chengqing Zong, Fei Xia, Wenjie Li, Roberto Navigli
Venues:
ACL | IJCNLP
Publisher:
Association for Computational Linguistics
Pages:
1419–1434
URL:
https://aclanthology.org/2021.acl-long.113
DOI:
10.18653/v1/2021.acl-long.113
Bibkey:
Cite (ACL):
Xinnuo Xu, Ondřej Dušek, Verena Rieser, and Ioannis Konstas. 2021. AggGen: Ordering and Aggregating while Generating. In Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing (Volume 1: Long Papers), pages 1419–1434, Online. Association for Computational Linguistics.
Cite (Informal):
AggGen: Ordering and Aggregating while Generating (Xu et al., ACL-IJCNLP 2021)
PDF:
https://aclanthology.org/2021.acl-long.113.pdf
Video:
https://aclanthology.org/2021.acl-long.113.mp4
Code:
XinnuoXu/AggGen
Data:
WebNLG