Exploiting Rich Textual User-Product Context for Improving Personalized Sentiment Analysis

Chenyang Lyu, Linyi Yang, Yue Zhang, Yvette Graham, Jennifer Foster


Abstract
User and product information associated with a review is useful for sentiment polarity prediction. Typical approaches incorporating such information focus on modeling users and products as implicitly learned representation vectors. Most do not exploit the potential of historical reviews, and those that do either require unnecessary modifications to the model architecture or do not make full use of user/product associations. The contribution of this work is twofold: i) a method to explicitly employ historical reviews belonging to the same user/product in initializing representations, and ii) efficient incorporation of textual associations between users and products via a user-product cross-context module. Experiments on the IMDb, Yelp-2013 and Yelp-2014 English benchmarks with BERT, SpanBERT and Longformer pretrained language models show that our approach substantially outperforms the previous state of the art.
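To make the two contributions concrete, the following is a minimal, illustrative sketch in PyTorch of (i) initializing a user/product representation from its historical reviews and (ii) a user-product cross-context interaction via cross-attention. It assumes a BERT-style encoder from Hugging Face Transformers; names such as init_entity_repr and CrossContextModule are hypothetical and do not correspond to the paper's exact implementation.

```python
# A hedged sketch of the abstract's two ideas, not the authors' actual code.
import torch
import torch.nn as nn
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

@torch.no_grad()
def init_entity_repr(historical_reviews):
    """Initialize a user/product vector from its historical review texts
    by mean-pooling the [CLS] embedding of each review (pooling choice is
    an assumption; the paper may aggregate differently)."""
    batch = tokenizer(historical_reviews, padding=True, truncation=True,
                      return_tensors="pt")
    cls = encoder(**batch).last_hidden_state[:, 0]  # (num_reviews, hidden)
    return cls.mean(dim=0)                          # (hidden,)

class CrossContextModule(nn.Module):
    """Hypothetical user-product cross-context: each side attends to the
    other's historical-review representation with multi-head attention."""
    def __init__(self, hidden=768, heads=8):
        super().__init__()
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)

    def forward(self, user_vec, product_vec):
        u = user_vec.unsqueeze(0).unsqueeze(0)    # (1, 1, hidden)
        p = product_vec.unsqueeze(0).unsqueeze(0)
        u_ctx, _ = self.attn(u, p, p)             # user attends to product
        p_ctx, _ = self.attn(p, u, u)             # product attends to user
        return u_ctx.squeeze(), p_ctx.squeeze()

# Example: contextualize a user and a product from their review histories.
user_vec = init_entity_repr(["Great pacing.", "Loved the soundtrack."])
prod_vec = init_entity_repr(["Too long.", "Beautiful cinematography."])
u_ctx, p_ctx = CrossContextModule()(user_vec, prod_vec)
```

The contextualized vectors would then be combined with the target review's encoding for polarity classification; how exactly they are fused is left open here, as the paper describes its own integration scheme.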
Anthology ID:
2023.findings-acl.92
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
1419–1429
URL:
https://aclanthology.org/2023.findings-acl.92
DOI:
10.18653/v1/2023.findings-acl.92
Bibkey:
Cite (ACL):
Chenyang Lyu, Linyi Yang, Yue Zhang, Yvette Graham, and Jennifer Foster. 2023. Exploiting Rich Textual User-Product Context for Improving Personalized Sentiment Analysis. In Findings of the Association for Computational Linguistics: ACL 2023, pages 1419–1429, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Exploiting Rich Textual User-Product Context for Improving Personalized Sentiment Analysis (Lyu et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.92.pdf
Video:
https://aclanthology.org/2023.findings-acl.92.mp4