User Embedding Model for Personalized Language Prompting

Sumanth Doddapaneni, Krishna Sayana, Ambarish Jash, Sukhdeep Sodhi, Dima Kuzmin


Abstract
Modeling long user histories plays a pivotal role in enhancing recommendation systems, allowing them to capture users’ evolving preferences and produce more precise, personalized recommendations. In this study, we tackle the challenge of modeling long user histories for preference understanding in natural language. Specifically, we introduce a new User Embedding Module (UEM) that efficiently processes free-form textual user histories by compressing them into embeddings, which serve as soft prompts to a language model (LM). Our experiments demonstrate that this approach handles significantly longer histories than conventional text-based methods, yielding substantial improvements in predictive performance: models trained with our approach achieve gains of up to 0.21 and 0.25 F1 points over text-based prompting baselines. The main contribution of this research is demonstrating the ability to bias language models via user signals.
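To make the soft-prompting setup concrete, below is a minimal PyTorch sketch of the idea the abstract describes: a module that compresses each user-history item into a vector in the LM's embedding space, so the resulting vectors can be prepended to the LM's input embeddings. This is not the authors' implementation; the module name, mean-pooling strategy, dimensions, and projection layer are all illustrative assumptions.

    # Illustrative sketch only; all dimensions and design choices are assumptions.
    import torch
    import torch.nn as nn

    class UserEmbeddingModule(nn.Module):
        """Compresses a user's history items into soft-prompt vectors."""
        def __init__(self, vocab_size: int, item_dim: int, lm_dim: int):
            super().__init__()
            self.token_emb = nn.Embedding(vocab_size, item_dim)
            self.proj = nn.Linear(item_dim, lm_dim)  # map into the LM embedding space

        def forward(self, history_ids: torch.Tensor) -> torch.Tensor:
            # history_ids: (batch, num_items, tokens_per_item)
            tok = self.token_emb(history_ids)   # (B, N, T, item_dim)
            item_vecs = tok.mean(dim=2)         # pool tokens -> one vector per item
            return self.proj(item_vecs)         # (B, N, lm_dim) soft prompts

    # Usage: prepend the UEM output to the LM's token embeddings, so the LM
    # attends to a compact representation of the full history instead of the
    # raw history text.
    vocab, item_dim, lm_dim = 32000, 128, 768
    uem = UserEmbeddingModule(vocab, item_dim, lm_dim)
    lm_token_emb = nn.Embedding(vocab, lm_dim)      # stand-in for the LM's embedding table

    history = torch.randint(0, vocab, (2, 10, 16))  # 2 users, 10 items, 16 tokens each
    prompt_ids = torch.randint(0, vocab, (2, 8))    # the natural-language task prompt

    soft_prompts = uem(history)                     # (2, 10, lm_dim)
    inputs = torch.cat([soft_prompts, lm_token_emb(prompt_ids)], dim=1)
    # `inputs` would be fed to the LM through its input-embeddings pathway.
    print(inputs.shape)                             # torch.Size([2, 18, 768])

Because each history item costs one soft-prompt position rather than its full token count, this arrangement is why far longer histories fit in the LM's context than with text-based prompting.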
Anthology ID:
2024.personalize-1.12
Volume:
Proceedings of the 1st Workshop on Personalization of Generative AI Systems (PERSONALIZE 2024)
Month:
March
Year:
2024
Address:
St. Julians, Malta
Editors:
Ameet Deshpande, EunJeong Hwang, Vishvak Murahari, Joon Sung Park, Diyi Yang, Ashish Sabharwal, Karthik Narasimhan, Ashwin Kalyan
Venues:
PERSONALIZE | WS
Publisher:
Association for Computational Linguistics
Pages:
124–131
URL:
https://aclanthology.org/2024.personalize-1.12
Cite (ACL):
Sumanth Doddapaneni, Krishna Sayana, Ambarish Jash, Sukhdeep Sodhi, and Dima Kuzmin. 2024. User Embedding Model for Personalized Language Prompting. In Proceedings of the 1st Workshop on Personalization of Generative AI Systems (PERSONALIZE 2024), pages 124–131, St. Julians, Malta. Association for Computational Linguistics.
Cite (Informal):
User Embedding Model for Personalized Language Prompting (Doddapaneni et al., PERSONALIZE-WS 2024)
PDF:
https://aclanthology.org/2024.personalize-1.12.pdf