Conditioning LLMs with Emotion in Neural Machine Translation

Charles Brazier, Jean-Luc Rouas


Abstract
Large Language Models (LLMs) have shown remarkable performance in Natural Language Processing tasks, including Machine Translation (MT). In this work, we propose a novel MT pipeline that integrates emotion information extracted from a Speech Emotion Recognition (SER) model into LLMs to enhance translation quality. We first fine-tune five existing LLMs on the Libri-trans dataset and select the best-performing model. Subsequently, we augment LLM prompts with different dimensional emotions and train the selected LLM under these different configurations. Our experiments reveal that integrating emotion information, especially arousal, into LLM prompts leads to notable improvements in translation quality.
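The abstract describes conditioning translation prompts on dimensional emotion values predicted by an SER model. The sketch below illustrates one way such a prompt could be assembled; the template wording, the `[arousal: …]` tag format, the value range, and the English-French direction (Libri-trans) are assumptions for illustration, not the authors' exact prompt design.

```python
# Hypothetical sketch of an emotion-augmented translation prompt.
# The tag format and assumed arousal range [0, 1] are illustrative
# assumptions, not the paper's exact configuration.

def build_prompt(source_text: str, arousal: float) -> str:
    """Prepend a dimensional emotion cue (here, arousal) to a
    translation instruction for an instruction-tuned LLM."""
    emotion_tag = f"[arousal: {arousal:.2f}]"
    return (
        f"{emotion_tag} Translate the following English sentence "
        f"into French: {source_text}"
    )

# In the described pipeline, the arousal value would come from an
# SER model applied to the source-side speech recording.
prompt = build_prompt("I can't believe we won!", arousal=0.85)
print(prompt)
```

The same template could carry valence or dominance instead of arousal, which is how the different prompt configurations compared in the paper could be realized.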
Anthology ID:
2024.iwslt-1.5
Volume:
Proceedings of the 21st International Conference on Spoken Language Translation (IWSLT 2024)
Month:
August
Year:
2024
Address:
Bangkok, Thailand (in-person and online)
Editors:
Elizabeth Salesky, Marcello Federico, Marine Carpuat
Venue:
IWSLT
Publisher:
Association for Computational Linguistics
Pages:
33–38
URL:
https://aclanthology.org/2024.iwslt-1.5
Cite (ACL):
Charles Brazier and Jean-Luc Rouas. 2024. Conditioning LLMs with Emotion in Neural Machine Translation. In Proceedings of the 21st International Conference on Spoken Language Translation (IWSLT 2024), pages 33–38, Bangkok, Thailand (in-person and online). Association for Computational Linguistics.
Cite (Informal):
Conditioning LLMs with Emotion in Neural Machine Translation (Brazier & Rouas, IWSLT 2024)
PDF:
https://aclanthology.org/2024.iwslt-1.5.pdf