Hierarchical Recurrent Aggregative Generation for Few-Shot NLG

Giulio Zhou, Gerasimos Lampouras, Ignacio Iacobacci


Abstract
Large pretrained models enable transfer learning to low-resource domains for language generation tasks. However, previous end-to-end approaches do not account for the fact that some generation sub-tasks, specifically aggregation and lexicalisation, benefit from transfer learning to different extents. To exploit these varying potentials for transfer learning, we propose a new hierarchical approach for few-shot and zero-shot generation. Our approach consists of a jointly trained three-module architecture: the first module independently lexicalises the distinct units of information in the input as sentence sub-units (e.g. phrases), the second module recurrently aggregates these sub-units to generate a unified intermediate output, and the third module subsequently post-edits it into a coherent and fluent final text. We perform extensive empirical analysis and ablation studies in few-shot and zero-shot settings across four datasets. Automatic and human evaluations show that the proposed hierarchical approach consistently achieves state-of-the-art results compared to previous work.
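
For intuition, below is a minimal sketch of the three-stage pipeline the abstract describes. The function names and string-level stubs (lexicalise, aggregate, post_edit) are hypothetical placeholders standing in for the paper's jointly trained neural modules; only the control flow (independent lexicalisation, recurrent aggregation, final post-editing) mirrors the described architecture.

from typing import List

def lexicalise(unit: dict) -> str:
    """Module 1 (hypothetical stub): render one input unit
    (e.g. a slot-value pair) as a sentence sub-unit such as a phrase."""
    return f"{unit['slot']} is {unit['value']}"

def aggregate(phrases: List[str]) -> str:
    """Module 2 (hypothetical stub): recurrently fold the phrases into a
    single intermediate output, merging one phrase per step."""
    text = phrases[0]
    for phrase in phrases[1:]:
        text = f"{text}, and {phrase}"  # one recurrent aggregation step
    return text

def post_edit(text: str) -> str:
    """Module 3 (hypothetical stub): revise the intermediate output
    into a coherent and fluent final text."""
    return text[0].upper() + text[1:] + "."

def generate(units: List[dict]) -> str:
    phrases = [lexicalise(u) for u in units]  # lexicalise each unit independently
    draft = aggregate(phrases)                # recurrent aggregation
    return post_edit(draft)                   # final post-editing pass

if __name__ == "__main__":
    mr = [{"slot": "name", "value": "Aromi"},
          {"slot": "food", "value": "Italian"},
          {"slot": "area", "value": "city centre"}]
    print(generate(mr))
    # -> Name is Aromi, and food is Italian, and area is city centre.

In the paper the three modules are trained jointly end-to-end; the split is what lets aggregation and lexicalisation draw on transfer learning to different extents.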
Anthology ID:
2022.findings-acl.170
Volume:
Findings of the Association for Computational Linguistics: ACL 2022
Month:
May
Year:
2022
Address:
Dublin, Ireland
Editors:
Smaranda Muresan, Preslav Nakov, Aline Villavicencio
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
2167–2181
URL:
https://aclanthology.org/2022.findings-acl.170
DOI:
10.18653/v1/2022.findings-acl.170
Cite (ACL):
Giulio Zhou, Gerasimos Lampouras, and Ignacio Iacobacci. 2022. Hierarchical Recurrent Aggregative Generation for Few-Shot NLG. In Findings of the Association for Computational Linguistics: ACL 2022, pages 2167–2181, Dublin, Ireland. Association for Computational Linguistics.
Cite (Informal):
Hierarchical Recurrent Aggregative Generation for Few-Shot NLG (Zhou et al., Findings 2022)
PDF:
https://aclanthology.org/2022.findings-acl.170.pdf
Video:
https://aclanthology.org/2022.findings-acl.170.mp4
Data:
SGD