CERET: Cost-Effective Extrinsic Refinement for Text Generation

Jason Cai, Hang Su, Monica Sunkara, Igor Shalyminov, Saab Mansour


Abstract
Large Language Models (LLMs) are powerful generators, but they may not produce good-quality outputs on their first attempt. Apart from model fine-tuning, existing approaches to improving prediction accuracy and quality typically involve LLM self-improvement or self-reflection, which incorporates feedback from the models themselves. Despite their effectiveness, these methods are hindered by high computational cost and a lack of scalability. In this work, we propose CERET, a method for refining text generations by considering semantic stability, entailment, and inter-sample uncertainty measures. Experimental results show that CERET consistently outperforms Self-consistency and Self-rerank baselines under various task setups, by 1.6% in Rouge-1 for abstractive summarization and 3.5% in hit rate for question answering. Compared to the LLM Self-rerank method, our approach requires only 9.4% of its latency and is more cost-effective.
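
The abstract describes scoring sampled candidate generations along three axes (semantic stability, entailment, inter-sample uncertainty) and selecting the best one. Below is a minimal, illustrative sketch of such a reranking step; the scoring functions, weights, and combination rule here are stand-in assumptions for illustration and are not taken from the paper, which uses learned semantic and entailment models.

```python
# Minimal sketch, not the authors' implementation: rerank sampled candidates
# with a weighted combination of stability, entailment, and uncertainty scores.
# All scoring functions below are crude lexical stand-ins (assumptions).

def jaccard(a: str, b: str) -> float:
    """Token-set overlap; stand-in for a semantic similarity model."""
    ta, tb = set(a.lower().split()), set(b.lower().split())
    return len(ta & tb) / len(ta | tb) if ta and tb else 0.0

def stability(candidates: list[str]) -> list[float]:
    """Semantic stability: mean similarity of each candidate to the other samples."""
    return [
        sum(jaccard(c, o) for j, o in enumerate(candidates) if j != i) / (len(candidates) - 1)
        for i, c in enumerate(candidates)
    ]

def entailment(source: str, candidates: list[str]) -> list[float]:
    """Stand-in for an NLI model's entailment probability of a candidate given the source."""
    return [jaccard(source, c) for c in candidates]

def rerank(source: str, candidates: list[str], weights=(1.0, 1.0, 1.0)) -> str:
    """Select the candidate maximizing w1*stability + w2*entailment - w3*uncertainty."""
    stab = stability(candidates)
    ent = entailment(source, candidates)
    unc = [1.0 - s for s in stab]  # crude proxy: low agreement with other samples = high uncertainty
    w1, w2, w3 = weights
    combined = [w1 * s + w2 * e - w3 * u for s, e, u in zip(stab, ent, unc)]
    return max(zip(combined, candidates))[1]

if __name__ == "__main__":
    src = "The company reported record quarterly revenue, driven by cloud growth."
    cands = [
        "Revenue hit a record this quarter thanks to cloud growth.",
        "The company is being sued over its cloud products.",
        "Record quarterly revenue was reported, driven by the cloud business.",
    ]
    print(rerank(src, cands))
```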
Anthology ID:
2024.naacl-long.409
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Kevin Duh, Helena Gomez, Steven Bethard
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
7370–7383
URL:
https://aclanthology.org/2024.naacl-long.409
Cite (ACL):
Jason Cai, Hang Su, Monica Sunkara, Igor Shalyminov, and Saab Mansour. 2024. CERET: Cost-Effective Extrinsic Refinement for Text Generation. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 1: Long Papers), pages 7370–7383, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
CERET: Cost-Effective Extrinsic Refinement for Text Generation (Cai et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-long.409.pdf
Copyright:
2024.naacl-long.409.copyright.pdf