SAPPHIRE: Approaches for Enhanced Concept-to-Text Generation

Steven Y. Feng, Jessica Huynh, Chaitanya Prasad Narisetty, Eduard Hovy, Varun Gangal


Abstract
We motivate and propose a suite of simple but effective improvements for concept-to-text generation called SAPPHIRE: Set Augmentation and Post-hoc PHrase Infilling and REcombination. We demonstrate their effectiveness on generative commonsense reasoning, a.k.a. the CommonGen task, through experiments using both BART and T5 models. Through extensive automatic and human evaluation, we show that SAPPHIRE noticeably improves model performance. An in-depth qualitative analysis illustrates that SAPPHIRE effectively addresses many issues of the baseline model generations, including lack of commonsense, insufficient specificity, and poor fluency.
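To make the task concrete, the sketch below shows baseline concept-to-text generation in the CommonGen setting with an off-the-shelf BART model via HuggingFace Transformers. This is only an illustration of the underlying task, not the SAPPHIRE method itself; the input format (concepts joined by spaces) and the choice of the `facebook/bart-large` checkpoint are assumptions for the example.

```python
# Minimal sketch of concept-to-text generation (CommonGen-style), assuming a
# plain seq2seq setup: feed a concept set as input, generate a sentence.
# This does NOT implement SAPPHIRE's set augmentation or phrase infilling.
from transformers import BartForConditionalGeneration, BartTokenizer

model_name = "facebook/bart-large"  # assumed checkpoint; the paper fine-tunes BART/T5
tokenizer = BartTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# A CommonGen-style concept set; joining with spaces is an assumed input format.
concepts = ["dog", "frisbee", "catch", "throw"]
inputs = tokenizer(" ".join(concepts), return_tensors="pt")

# Beam search decoding; hyperparameters are illustrative, not from the paper.
output_ids = model.generate(
    inputs["input_ids"],
    num_beams=5,
    max_length=32,
    early_stopping=True,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Without task-specific fine-tuning, such a baseline often produces outputs that are generic or omit concepts; SAPPHIRE targets exactly these weaknesses (commonsense plausibility, specificity, fluency) on top of fine-tuned BART and T5 models.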
Anthology ID:
2021.inlg-1.21
Volume:
Proceedings of the 14th International Conference on Natural Language Generation
Month:
August
Year:
2021
Address:
Aberdeen, Scotland, UK
Editors:
Anya Belz, Angela Fan, Ehud Reiter, Yaji Sripada
Venue:
INLG
SIG:
SIGGEN
Publisher:
Association for Computational Linguistics
Pages:
212–225
URL:
https://aclanthology.org/2021.inlg-1.21
DOI:
10.18653/v1/2021.inlg-1.21
Cite (ACL):
Steven Y. Feng, Jessica Huynh, Chaitanya Prasad Narisetty, Eduard Hovy, and Varun Gangal. 2021. SAPPHIRE: Approaches for Enhanced Concept-to-Text Generation. In Proceedings of the 14th International Conference on Natural Language Generation, pages 212–225, Aberdeen, Scotland, UK. Association for Computational Linguistics.
Cite (Informal):
SAPPHIRE: Approaches for Enhanced Concept-to-Text Generation (Feng et al., INLG 2021)
PDF:
https://aclanthology.org/2021.inlg-1.21.pdf
Code:
styfeng/sapphire
Data:
CommonGen