Peijie Sun, Le Wu, Kun Zhang, Xiangzhi Chen, Meng Wang
While effective in recommendation tasks, collaborative filtering (CF) techniques face the challenge of data sparsity. Researchers have begun leveraging contrastive learning to introduce additional self-supervised signals to mitigate this. However, this approach often unintentionally pushes the target user/item away from its collaborative neighbors, limiting its efficacy. In response, we propose treating the collaborative neighbors of the anchor node as positive samples in the final objective loss function. This paper develops two supervised contrastive loss functions that effectively combine supervision signals with the contrastive loss. We analyze the proposed loss functions through the lens of gradients, showing that different positive samples simultaneously influence the update of the anchor node's embedding, and that each sample's influence depends on its similarity to the anchor node and to the negative samples. Using a graph-based collaborative filtering model as the backbone and the same data augmentation methods as the existing contrastive learning model SGL, our Neighborhood-Enhanced Supervised Contrastive Loss (NESCL) model substitutes SGL's contrastive loss function with our novel loss, yielding a marked performance improvement. On three real-world datasets, Yelp2018, Gowalla, and Amazon-Book, NESCL surpasses the original SGL by 10.09%, 7.09%, and 35.36% on NDCG@20, respectively.
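The abstract's core idea of averaging a contrastive log-ratio term over multiple positives (here, collaborative neighbors of the anchor) can be sketched as a SupCon-style loss. This is a minimal illustrative sketch, not the paper's exact NESCL formulation; the function name, the temperature value, and the toy embeddings are assumptions for demonstration.

```python
import numpy as np

def supervised_contrastive_loss(anchor, positives, negatives, tau=0.2):
    """SupCon-style loss over one anchor embedding.

    Each positive (e.g. a collaborative neighbor of the anchor) contributes
    its own log-ratio term; terms are averaged so every positive pulls on
    the anchor's embedding simultaneously, weighted by its similarity to
    the anchor relative to all other samples in the denominator.
    """
    def sim(a, b):
        # cosine similarity between two embedding vectors
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    # Denominator sums over all non-anchor samples (positives and negatives).
    all_samples = positives + negatives
    denom = sum(np.exp(sim(anchor, s) / tau) for s in all_samples)

    # One -log term per positive, averaged (the "supervised" part).
    terms = [np.log(np.exp(sim(anchor, p) / tau) / denom) for p in positives]
    return -float(np.mean(terms))
```

A positive that lies close to the anchor yields a ratio near 1 and thus a near-zero loss term, while a distant positive is penalized more heavily, matching the gradient intuition described above.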