Jinseok Seol
2024
Enhancing Large Language Model Based Sequential Recommender Systems with Pseudo Labels Reconstruction
Hyunsoo Na | Minseok Gang | Youngrok Ko | Jinseok Seol | Sang-goo Lee
Findings of the Association for Computational Linguistics: EMNLP 2024
Large language models (LLMs) are utilized in various studies, and they also demonstrate the potential to function independently as recommendation models. Nevertheless, training on sequences and text labels modifies LLMs' pre-trained weights, diminishing their inherent strength in constructing and comprehending natural language sentences. In this study, we propose a reconstruction-based LLM recommendation model (ReLRec) that harnesses the feature extraction capability of LLMs while preserving their sentence generation abilities. We reconstruct user and item pseudo-labels generated from user reviews while training on sequential data, aiming to exploit the key features of both users and items. Experimental results demonstrate the efficacy of label reconstruction in sequential recommendation tasks.
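The abstract suggests a multi-task setup: a standard sequential recommendation objective trained jointly with reconstruction of review-derived pseudo-labels. Below is a minimal sketch of such a joint loss, assuming a next-item classification head and a token-level reconstruction head; the class name `JointRecReconstructionLoss`, the tensor shapes, and the weighting term `alpha` are illustrative assumptions, not ReLRec's actual implementation.

```python
import torch
import torch.nn as nn

class JointRecReconstructionLoss(nn.Module):
    """Hypothetical joint objective: next-item recommendation loss plus a
    reconstruction loss on pseudo-label text; weighting is illustrative."""
    def __init__(self, alpha: float = 0.5):
        super().__init__()
        self.alpha = alpha
        self.rec_loss = nn.CrossEntropyLoss()    # next-item prediction over the item vocabulary
        self.recon_loss = nn.CrossEntropyLoss()  # token-level reconstruction of pseudo-labels

    def forward(self, rec_logits, next_item, recon_logits, label_tokens):
        # rec_logits:   (batch, num_items)        scores for the next item
        # next_item:    (batch,)                  ground-truth next-item ids
        # recon_logits: (batch, seq_len, vocab)   token logits for the pseudo-label text
        # label_tokens: (batch, seq_len)          pseudo-label token ids derived from reviews
        l_rec = self.rec_loss(rec_logits, next_item)
        l_recon = self.recon_loss(
            recon_logits.reshape(-1, recon_logits.size(-1)),
            label_tokens.reshape(-1),
        )
        return l_rec + self.alpha * l_recon
```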
2017
A Syllable-based Technique for Word Embeddings of Korean Words
Sanghyuk Choi | Taeuk Kim | Jinseok Seol | Sang-goo Lee
Proceedings of the First Workshop on Subword and Character Level Models in NLP
Word embedding has become a fundamental component of many NLP tasks such as named entity recognition and machine translation. However, popular models that learn such embeddings are unaware of the morphology of words, so they are not directly applicable to highly agglutinative languages such as Korean. We propose a syllable-based learning model for Korean using a convolutional neural network, in which a word representation is composed of trained syllable vectors. Our model successfully produces morphologically meaningful representations of Korean words compared to the original Skip-gram embeddings. The results also show that it is quite robust to the out-of-vocabulary problem.
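As a rough illustration of composing a word vector from its syllables with a CNN, the sketch below embeds syllable ids, applies a 1D convolution, and max-pools over syllable positions. The name `SyllableCNNWordEncoder`, the dimensions, and the filter width are assumptions rather than the paper's configuration; the resulting word vector would then feed a Skip-gram-style objective, which is also why out-of-vocabulary words remain representable, since any Korean word decomposes into known syllables.

```python
import torch
import torch.nn as nn

class SyllableCNNWordEncoder(nn.Module):
    """Illustrative sketch: compose a Korean word vector from its syllable
    sequence with a 1D CNN; sizes are assumptions, not the paper's setup."""
    def __init__(self, num_syllables: int, syl_dim: int = 64,
                 word_dim: int = 128, kernel_size: int = 3):
        super().__init__()
        self.syl_emb = nn.Embedding(num_syllables, syl_dim, padding_idx=0)
        self.conv = nn.Conv1d(syl_dim, word_dim, kernel_size, padding=1)

    def forward(self, syllable_ids):
        # syllable_ids: (batch, max_syllables) integer ids, 0 = padding
        x = self.syl_emb(syllable_ids)      # (batch, syl_len, syl_dim)
        x = self.conv(x.transpose(1, 2))    # (batch, word_dim, syl_len)
        word_vec, _ = x.max(dim=2)          # max-pool over syllable positions
        return word_vec                     # (batch, word_dim)
```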
Co-authors
- Sang-goo Lee 2
- Hyunsoo Na 1
- Minseok Gang 1
- Youngrok Ko 1
- Sanghyuk Choi 1