Few-shot Table-to-text Generation with Prefix-Controlled Generator

Yutao Luo, Menghua Lu, Gongshen Liu, Shilin Wang


Abstract
Neural table-to-text generation approaches are data-hungry, limiting their adaptation to low-resource real-world applications. Previous works mostly resort to Pre-trained Language Models (PLMs) to generate fluent summaries of a table. However, their outputs often contain hallucinated content due to the uncontrolled nature of PLMs. Moreover, the topological differences between tables and sequences are rarely studied. Last but not least, fine-tuning PLMs on a handful of instances may lead to over-fitting and catastrophic forgetting. To alleviate these problems, we propose a prompt-based approach, the Prefix-Controlled Generator (PCG), for few-shot table-to-text generation. We prepend a task-specific prefix to a PLM to make the table structure better fit the pre-trained input. In addition, we generate an input-specific prefix to control the factual content and word order of the generated text. Both automatic and human evaluations on different domains (humans, books, and songs) of the WikiBio dataset prove the effectiveness of our approach.
Anthology ID:
2022.coling-1.565
Volume:
Proceedings of the 29th International Conference on Computational Linguistics
Month:
October
Year:
2022
Address:
Gyeongju, Republic of Korea
Venue:
COLING
Publisher:
International Committee on Computational Linguistics
Pages:
6493–6504
URL:
https://aclanthology.org/2022.coling-1.565
Cite (ACL):
Yutao Luo, Menghua Lu, Gongshen Liu, and Shilin Wang. 2022. Few-shot Table-to-text Generation with Prefix-Controlled Generator. In Proceedings of the 29th International Conference on Computational Linguistics, pages 6493–6504, Gyeongju, Republic of Korea. International Committee on Computational Linguistics.
Cite (Informal):
Few-shot Table-to-text Generation with Prefix-Controlled Generator (Luo et al., COLING 2022)
PDF:
https://aclanthology.org/2022.coling-1.565.pdf