JOURNAL ARTICLE

Knowledge-Enhanced Multi-Behaviour Contrastive Learning for Effective Recommendation

Abstract

Real-world recommendation scenarios usually need to handle diverse user-item interaction behaviours, including page views, adding items to carts, and purchases. The interactions that precede the actual target behaviour (e.g., purchasing an item) capture the user's preferences from different angles and serve as auxiliary information (e.g., page views) that enriches the system's knowledge of users' preferences, thereby helping to enhance recommendation for the target behaviour. Despite efforts to model users' multi-behaviour interaction information, existing multi-behaviour recommenders still face two challenges: (1) data sparsity across multiple user behaviours, a common issue that limits recommendation performance, particularly for the target behaviour, which typically exhibits fewer interactions than the auxiliary behaviours; and (2) noisy auxiliary interaction behaviours, where the information in the auxiliary behaviours might be irrelevant for recommendation. In this case, directly applying contrastive learning between the target behaviour and the auxiliary behaviours will amplify the noise in the auxiliary behaviours, thereby negatively impacting the real semantics that can be derived from the target behaviour. To address these two challenges, we propose a new model called Knowledge-Enhanced Multi-behaviour Contrastive Learning for Recommendation (KEMCL). In particular, to address the sparsity of user multi-behaviour interaction information, we leverage a dual-perspective knowledge encoding component that enriches the semantic representations of items, and we generate supervision signals through self-supervised learning so as to enhance recommendation. In addition, we develop a cross-behaviour learning component, which includes two contrastive learning (CL) methods, inter CL and intra CL, to alleviate the problem of noisy auxiliary interactions.
Extensive experiments on three public recommendation datasets show that our proposed KEMCL model significantly outperforms seven existing state-of-the-art methods. In particular, our KEMCL model outperforms the best baseline, namely KMCLR, by 5.42% on the large Tmall dataset.
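The abstract's cross-behaviour learning component contrasts representations of the same item (or user) under the target and auxiliary behaviours. As a minimal sketch of such an objective (not the authors' implementation), the following NumPy function computes an InfoNCE-style loss in which matching rows of two behaviour-specific embedding matrices are positive pairs and all other in-batch rows are negatives; the function name and the temperature value are illustrative assumptions.

```python
import numpy as np

def info_nce(target_emb, aux_emb, temperature=0.2):
    """InfoNCE-style contrastive loss between two behaviour views.

    Row i of `target_emb` (target behaviour) and row i of `aux_emb`
    (auxiliary behaviour) are treated as a positive pair; all other
    rows in the batch serve as negatives.  Illustrative sketch only.
    """
    # L2-normalise both views so dot products are cosine similarities.
    t = target_emb / np.linalg.norm(target_emb, axis=1, keepdims=True)
    a = aux_emb / np.linalg.norm(aux_emb, axis=1, keepdims=True)
    logits = t @ a.T / temperature               # (batch, batch) similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    # Softmax cross-entropy with the diagonal as the positive class.
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))
```

In a multi-behaviour setting, the loss is lower when the two views agree on which items correspond, which is what pulls target- and auxiliary-behaviour embeddings of the same item together while pushing different items apart.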

Keywords:
Computer science, Contrastive analysis, Natural language processing, Artificial intelligence, Knowledge management, Linguistics

Metrics

Cited By: 4
FWCI (Field Weighted Citation Impact): 6.11
Refs: 36
Citation Normalized Percentile: 0.94

Topics

Recommender Systems and Techniques
Physical Sciences →  Computer Science →  Information Systems
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

BOOK-CHAPTER

MLCLR: Multi-Level Contrastive Learning Enhanced Knowledge Graph-Based Recommendation

Yachao Cui, Mengmeng Zhang

Lecture Notes in Computer Science Year: 2025 Pages: 219-230
JOURNAL ARTICLE

Enhanced knowledge graph recommendation algorithm based on multi-level contrastive learning

Rong Zhang, Yuan Liu, Xin Yang

Journal: Scientific Reports Year: 2024 Vol: 14 (1) Pages: 23051-23051
JOURNAL ARTICLE

GCN-diffusion and multi-view contrastive learning for enhanced knowledge recommendation

Tao Xue, Lu Liu, Wen Lv, Long Xi

Journal:   Journal of Intelligent Information Systems Year: 2025