Emotion-Conditioned Text Generation through Automatic Prompt Optimization

Yarik Menchaca Resendiz, Roman Klinger


Abstract
Conditional natural language generation methods often require either expensive fine-tuning or training a large language model from scratch. Both are unlikely to lead to good results without a substantial amount of data and computational resources. Prompt learning, which leaves the parameters of a large language model unchanged, presents a promising alternative: it is cost-effective while still achieving competitive results. While this procedure is now established for zero- and few-shot text classification and structured prediction, it has received limited attention in conditional text generation. We present the first automatic prompt optimization approach for emotion-conditioned text generation with instruction-fine-tuned models. Our method uses an iterative optimization procedure that changes the prompt by adding, removing, or replacing tokens. As the objective function, we require only a text classifier that measures how well the conditional variable is realized in the generated text. We evaluate the method on emotion-conditioned text generation with a focus on event reports and compare it to manually designed prompts, which also serve as the seeds for the optimization procedure. The optimized prompts reach a macro-average F1 of 0.75 for fulfilling the emotion condition, compared to only 0.22 for the manually designed seed prompts.
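The abstract describes the optimization loop only at a high level. As a rough illustration (not the authors' implementation), the following Python sketch shows one way such a token-level, classifier-guided hill-climbing search could look. Here generate_text, emotion_score, and VOCAB are hypothetical stand-ins for an instruction-tuned LLM, an emotion classifier, and a candidate token pool; the random scorer exists only to keep the sketch self-contained and runnable.

import random

# Hypothetical stubs: a real setup would call an instruction-tuned LLM and an
# emotion classifier; these placeholders only make the sketch self-contained.
def generate_text(prompt):
    return "generated event report for: " + prompt

def emotion_score(text, target_emotion):
    # A real scorer would return the classifier's probability that the
    # generated text expresses the target emotion.
    return random.random()

VOCAB = ["describe", "event", "feeling", "report", "vivid", "personal"]

def propose_edit(tokens):
    """Return a copy of the prompt with one token added, removed, or replaced."""
    tokens = list(tokens)
    op = random.choice(["add", "remove", "replace"])
    if op == "add" or len(tokens) < 2:
        tokens.insert(random.randrange(len(tokens) + 1), random.choice(VOCAB))
    elif op == "remove":
        tokens.pop(random.randrange(len(tokens)))
    else:
        tokens[random.randrange(len(tokens))] = random.choice(VOCAB)
    return tokens

def optimize_prompt(seed_prompt, target_emotion, iterations=100):
    best = seed_prompt.split()
    best_score = emotion_score(generate_text(" ".join(best)), target_emotion)
    for _ in range(iterations):
        candidate = propose_edit(best)
        score = emotion_score(generate_text(" ".join(candidate)), target_emotion)
        if score > best_score:  # greedy: keep an edit only if it improves the score
            best, best_score = candidate, score
    return " ".join(best)

print(optimize_prompt("Describe an event that made you feel joy.", "joy"))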
Anthology ID: 2023.tllm-1.3
Volume: Proceedings of the 1st Workshop on Taming Large Language Models: Controllability in the era of Interactive Assistants!
Month: September
Year: 2023
Address: Prague, Czech Republic
Editors: Devamanyu Hazarika, Xiangru Robert Tang, Di Jin
Venues: TLLM | WS
Publisher: Association for Computational Linguistics
Pages: 24–30
URL: https://aclanthology.org/2023.tllm-1.3
Cite (ACL): Yarik Menchaca Resendiz and Roman Klinger. 2023. Emotion-Conditioned Text Generation through Automatic Prompt Optimization. In Proceedings of the 1st Workshop on Taming Large Language Models: Controllability in the era of Interactive Assistants!, pages 24–30, Prague, Czech Republic. Association for Computational Linguistics.
Cite (Informal): Emotion-Conditioned Text Generation through Automatic Prompt Optimization (Resendiz & Klinger, TLLM-WS 2023)
PDF: https://aclanthology.org/2023.tllm-1.3.pdf