PepRec: Progressive Enhancement of Prompting for Recommendation
Yakun Yu | Shi-ang Qi | Baochun Li | Di Niu
Proceedings of the 2024 Conference on Empirical Methods in Natural Language Processing
With large language models (LLMs) achieving remarkable breakthroughs in natural language processing (NLP), researchers have recently explored the potential of LLMs for recommendation systems by converting the input data into textual sentences through prompt templates. Although semantic knowledge from LLMs can enrich the content information of items, it remains hard for them to match the performance of traditional deep learning recommendation models, partly due to their inability to leverage collaborative filtering. In this paper, we propose a novel training-free prompting framework, PepRec, which captures knowledge from both content-based filtering and collaborative filtering to boost recommendation performance with LLMs, while providing interpretations for its recommendations. Experiments on two real-world datasets from different domains show that PepRec significantly outperforms various traditional deep learning recommendation models and prompt-based recommendation systems.
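The abstract describes prompt-based recommendation as converting input data into textual sentences through prompt templates, with collaborative filtering as the missing signal. The sketch below is a hypothetical illustration of that idea, not PepRec's actual method: all function names and the similarity heuristic are assumptions, showing how a user's own history (content signal) and the items liked by similar users (collaborative signal) could both be verbalized into one prompt for an LLM.

```python
# Hypothetical sketch, NOT the PepRec algorithm: composing a recommendation
# prompt from a content signal (the user's history) and a collaborative
# signal (items liked by the most similar users).

def jaccard(a, b):
    """Set overlap between two users' interaction histories."""
    a, b = set(a), set(b)
    return len(a & b) / len(a | b) if a | b else 0.0

def build_prompt(user, histories, titles, candidates, k=1):
    """Verbalize both filtering signals into a single textual prompt."""
    # Collaborative signal: items from the k most similar users that
    # the target user has not yet interacted with.
    others = sorted(
        ((jaccard(histories[user], h), u) for u, h in histories.items() if u != user),
        reverse=True,
    )
    neighbor_items = sorted(
        {i for _, u in others[:k] for i in histories[u]} - set(histories[user])
    )

    watched = ", ".join(titles[i] for i in histories[user])
    similar = ", ".join(titles[i] for i in neighbor_items)
    cands = ", ".join(titles[i] for i in candidates)
    return (
        f"The user has interacted with: {watched}.\n"
        f"Users with similar taste also liked: {similar}.\n"
        f"Rank these candidates for the user and explain why: {cands}."
    )

# Toy data for illustration only.
histories = {"u1": ["i1", "i2"], "u2": ["i1", "i2", "i3"], "u3": ["i4"]}
titles = {"i1": "The Matrix", "i2": "Inception", "i3": "Blade Runner", "i4": "Toy Story"}
prompt = build_prompt("u1", histories, titles, candidates=["i3", "i4"])
print(prompt)
```

The asked-for explanation ("explain why") in the final line mirrors the interpretability goal stated in the abstract; a real system would send this string to an LLM and parse the ranked answer.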