FedCSR: A Federated Framework for Multi-Platform Cross-Domain Sequential Recommendation with Dual Contrastive Learning
Dongyi Zheng | Hongyu Zhang | Jianyang Zhai | Lin Zhong | Lingzhi Wang | Jiyuan Feng | Xiangke Liao | Yonghong Tian | Nong Xiao | Qing Liao
Proceedings of the 31st International Conference on Computational Linguistics, 2025
Cross-domain sequential recommendation (CSR) has garnered significant attention. Current federated frameworks for CSR leverage information across multiple domains but often rely on user alignment, which increases communication costs and privacy risks. In this work, we propose FedCSR, a novel federated cross-domain sequential recommendation framework that eliminates the need for user alignment between platforms. FedCSR fully utilizes cross-domain knowledge to address the key challenges posed by data heterogeneity both across platforms and within each platform. To tackle the heterogeneity of data patterns between platforms, we introduce Model Contrastive Learning (MCL) to reduce the gap between local and global models. Additionally, we design Sequence Contrastive Learning (SCL) to address the heterogeneity of user preferences across different domains within a platform by employing tailored sequence augmentation techniques. Extensive experiments conducted on multiple real-world datasets demonstrate that FedCSR achieves superior performance compared to existing baseline methods.
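The abstract does not spell out the exact objectives behind MCL and SCL, but contrastive learning of this kind is typically built on an InfoNCE-style loss: an anchor representation (e.g. the local model's embedding in MCL, or an augmented sequence's embedding in SCL) is pulled toward a positive (the global model's embedding, or another augmentation of the same sequence) and pushed away from negatives. A minimal NumPy sketch of such a loss, where the function name, the temperature value, and the framing as per-anchor vectors are all illustrative assumptions rather than the paper's formulation:

```python
import numpy as np

def info_nce(anchor, positive, negatives, tau=0.5):
    """InfoNCE-style contrastive loss for a single anchor vector.

    Pulls `anchor` toward `positive` and pushes it away from each
    row of `negatives`, with temperature `tau` controlling how
    sharply similarity differences are weighted.
    """
    def cos(a, b):
        # Cosine similarity between two 1-D vectors.
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    pos = np.exp(cos(anchor, positive) / tau)
    neg = sum(np.exp(cos(anchor, n) / tau) for n in negatives)
    # Loss is small when the anchor is far more similar to the
    # positive than to any negative.
    return -np.log(pos / (pos + neg))

# Illustrative usage: a local representation, a nearby "global"
# positive, and unrelated negatives.
rng = np.random.default_rng(0)
z_local = rng.normal(size=8)
z_global = z_local + 0.1 * rng.normal(size=8)   # similar -> positive
z_others = rng.normal(size=(4, 8))              # dissimilar -> negatives
loss = info_nce(z_local, z_global, z_others)
```

In MCL the same idea would be applied between local and global model representations to curb inter-platform drift; in SCL, between differently augmented views of one user's interaction sequence.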