Domain Generalization via Switch Knowledge Distillation for Robust Review Representation

You Zhang, Jin Wang, Liang-Chih Yu, Dan Xu, Xuejie Zhang


Abstract
In content-based recommender systems, neural models injected with in-domain user and product information face an obvious obstacle when learning review representations for unseen or anonymous users. To generalize the in-domain classifier, most existing models train an extra plain-text model for the unseen domain. Because such a scheme incorporates no historical user or product information, unseen and anonymous users become dissociated from the recommender system. To simultaneously learn the review representations of both existing and unseen users, this study proposes switch knowledge distillation for domain generalization. A generalization-switch (GSwitch) model is first applied to inject user and product information by flexibly encoding both domain-invariant and domain-specific features. By toggling this switch ON or OFF, switch knowledge distillation is introduced to learn a robust review representation that performs well for both existing and unseen or anonymous users. Experiments were conducted on IMDB, Yelp-2013, and Yelp-2014, with users in the test data masked out to simulate unseen and anonymous users. The comparative results indicate that the proposed method enhances the generalization capability of several existing baseline models. For reproducibility, the code for this paper is available at: https://github.com/yoyo-yun/DG_RRR.
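The sketch below illustrates the switch idea described in the abstract: a single classifier whose user/product injection can be toggled ON (personalized) or OFF (plain text), trained with a distillation loss that pushes the OFF-mode predictions toward the ON-mode ones so the same model serves anonymous users. This is a minimal, hypothetical PyTorch illustration based only on the abstract; all names (GSwitchClassifier, skd_step), the GRU encoder, the additive injection, and the alpha/tau hyperparameters are assumptions, not the authors' implementation (see the linked repository for that).

import torch
import torch.nn as nn
import torch.nn.functional as F

class GSwitchClassifier(nn.Module):
    """Hypothetical GSwitch-style classifier: text encoder plus optional
    user/product injection controlled by a switch flag."""
    def __init__(self, vocab_size, n_users, n_products, dim=256, n_classes=5):
        super().__init__()
        self.word_emb = nn.Embedding(vocab_size, dim)
        self.user_emb = nn.Embedding(n_users, dim)     # domain-specific (user) features
        self.prod_emb = nn.Embedding(n_products, dim)  # domain-specific (product) features
        self.encoder = nn.GRU(dim, dim, batch_first=True)
        self.classifier = nn.Linear(dim, n_classes)

    def forward(self, tokens, user_ids, prod_ids, switch_on=True):
        h, _ = self.encoder(self.word_emb(tokens))     # domain-invariant text features
        text_repr = h.mean(dim=1)
        if switch_on:                                  # ON: inject user/product information
            text_repr = text_repr + self.user_emb(user_ids) + self.prod_emb(prod_ids)
        return self.classifier(text_repr)

def skd_step(model, tokens, user_ids, prod_ids, labels, alpha=0.5, tau=2.0):
    """One training step: supervised losses in both modes, plus a distillation
    loss that aligns the OFF-mode (anonymous) predictions with the ON-mode ones."""
    logits_on = model(tokens, user_ids, prod_ids, switch_on=True)
    logits_off = model(tokens, user_ids, prod_ids, switch_on=False)
    ce = F.cross_entropy(logits_on, labels) + F.cross_entropy(logits_off, labels)
    kd = F.kl_div(F.log_softmax(logits_off / tau, dim=-1),
                  F.softmax(logits_on.detach() / tau, dim=-1),
                  reduction="batchmean") * tau * tau
    return ce + alpha * kd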
Anthology ID:
2023.findings-acl.810
Volume:
Findings of the Association for Computational Linguistics: ACL 2023
Month:
July
Year:
2023
Address:
Toronto, Canada
Editors:
Anna Rogers, Jordan Boyd-Graber, Naoaki Okazaki
Venue:
Findings
Publisher:
Association for Computational Linguistics
Pages:
12812–12826
URL:
https://aclanthology.org/2023.findings-acl.810
DOI:
10.18653/v1/2023.findings-acl.810
Cite (ACL):
You Zhang, Jin Wang, Liang-Chih Yu, Dan Xu, and Xuejie Zhang. 2023. Domain Generalization via Switch Knowledge Distillation for Robust Review Representation. In Findings of the Association for Computational Linguistics: ACL 2023, pages 12812–12826, Toronto, Canada. Association for Computational Linguistics.
Cite (Informal):
Domain Generalization via Switch Knowledge Distillation for Robust Review Representation (Zhang et al., Findings 2023)
PDF:
https://aclanthology.org/2023.findings-acl.810.pdf
Video:
https://aclanthology.org/2023.findings-acl.810.mp4