Rephrasing Invokes Better Generations for Large Language Models

Haoran Yang, Hongyuan Lu, Wai Lam


Abstract
In the realm of the emerging multitasking abilities of large language models (LLMs), methodologies like prompt tuning enable low-cost adaptation to downstream tasks without retraining the model. However, automatic input pre-processing when LLMs are unavailable is currently under-studied. This paper proposes ReLLM (Rephrasing for LLMs), a method that automatically paraphrases input content to elicit better generations. ReLLM replaces low-frequency lexical items with their high-frequency counterparts. This substitution is particularly beneficial for low-resource language tasks that lack sufficient training data and resources. ReLLM is user-friendly and requires no additional LLM training. Experimental results on cross-lingual summarization and natural language inference demonstrate the effectiveness of ReLLM.
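
To illustrate the kind of substitution the abstract describes, the sketch below swaps rare words for more frequent synonyms. It is a minimal hypothetical example, not the authors' implementation: the rephrase function, the WordNet synonym source, and the min_zipf threshold are all assumptions, and it relies on the wordfreq and nltk packages.

# Hypothetical sketch of frequency-based lexical substitution, in the spirit of
# ReLLM's rephrasing step (not the authors' code). Requires the `wordfreq` and
# `nltk` packages; run nltk.download("wordnet") once beforehand.
from wordfreq import zipf_frequency
from nltk.corpus import wordnet as wn

def rephrase(sentence: str, lang: str = "en", min_zipf: float = 4.0) -> str:
    """Replace each rare word with its most frequent single-word WordNet synonym."""
    out = []
    for word in sentence.split():
        if zipf_frequency(word.lower(), lang) >= min_zipf:
            out.append(word)  # already frequent enough; keep as-is
            continue
        # Gather single-word synonym candidates from all WordNet senses.
        candidates = {
            lemma.name()
            for synset in wn.synsets(word.lower())
            for lemma in synset.lemmas()
            if "_" not in lemma.name()
        }
        best = max(candidates, key=lambda w: zipf_frequency(w, lang), default=word)
        # Substitute only if the candidate is genuinely more frequent.
        if zipf_frequency(best, lang) > zipf_frequency(word.lower(), lang):
            out.append(best)
        else:
            out.append(word)
    return " ".join(out)

# Example: rephrase("They arranged a clandestine meeting") may yield
# "They arranged a secret meeting", depending on the WordNet inventory.

In practice, this kind of substitution pre-processes the prompt before it is sent to the LLM, so no access to model parameters is needed.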
Anthology ID:
2024.naacl-srw.2
Volume:
Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 4: Student Research Workshop)
Month:
June
Year:
2024
Address:
Mexico City, Mexico
Editors:
Yang (Trista) Cao, Isabel Papadimitriou, Anaelia Ovalle, Marcos Zampieri, Francis Ferraro, Swabha Swayamdipta
Venue:
NAACL
Publisher:
Association for Computational Linguistics
Pages:
8–15
URL:
https://aclanthology.org/2024.naacl-srw.2
DOI:
10.18653/v1/2024.naacl-srw.2
Cite (ACL):
Haoran Yang, Hongyuan Lu, and Wai Lam. 2024. Rephrasing Invokes Better Generations for Large Language Models. In Proceedings of the 2024 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies (Volume 4: Student Research Workshop), pages 8–15, Mexico City, Mexico. Association for Computational Linguistics.
Cite (Informal):
Rephrasing Invokes Better Generations for Large Language Models (Yang et al., NAACL 2024)
PDF:
https://aclanthology.org/2024.naacl-srw.2.pdf