JOURNAL ARTICLE

Sequential Recommendation with Relation-Aware Kernelized Self-Attention

Mingi Ji, Weonyoung Joo, Kyungwoo Song, Yoon-Yeong Kim, Il-Chul Moon

Year: 2020   Journal: Proceedings of the AAAI Conference on Artificial Intelligence   Vol: 34 (04)   Pages: 4304-4311   Publisher: Association for the Advancement of Artificial Intelligence

Abstract

Recent studies have identified that sequential recommendation is improved by the attention mechanism. Following this development, we propose Relation-Aware Kernelized Self-Attention (RKSA), which adopts the self-attention mechanism of the Transformer augmented with a probabilistic model. The original self-attention of the Transformer is a deterministic measure without relation-awareness. Therefore, we introduce a latent space to the self-attention, and the latent space models the recommendation context from relations as a multivariate skew-normal distribution with a kernelized covariance matrix built from co-occurrences, item characteristics, and user information. This work merges the self-attention of the Transformer with sequential recommendation by adding a probabilistic model of the recommendation task specifics. We experimented with RKSA on the benchmark datasets, and RKSA shows significant improvements over recent baseline models. Also, RKSA was able to produce a latent space model that explains the reasons for its recommendations.
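The abstract's two key ingredients, a kernelized covariance built from item relations and skew-normal noise injected into attention scores, can be sketched in a few lines. This is a hypothetical illustration only, not the authors' implementation: the function names, the positive-semidefinite construction `K + w * (C @ C.T)`, and the hidden-truncation skew-normal sampler are all assumptions made for the sketch.

```python
import numpy as np

def kernelized_cov(item_feats, cooccur, gamma=1.0, w=0.5):
    # RBF kernel over item feature vectors (positive semidefinite)
    sq = ((item_feats[:, None, :] - item_feats[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)
    # normalized co-occurrence counts; C @ C.T keeps the sum PSD
    C = cooccur / (cooccur.max() + 1e-8)
    cov = K + w * (C @ C.T)
    # small jitter on the diagonal guarantees positive definiteness
    return cov + 1e-3 * np.eye(len(item_feats))

def skew_normal_sample(cov, alpha, rng):
    # hidden-truncation construction of a skew-normal draw:
    # delta * |U0| contributes the skew, the Cholesky factor the correlation
    n = cov.shape[0]
    delta = alpha / np.sqrt(1.0 + alpha ** 2)
    L = np.linalg.cholesky(cov)
    u0 = abs(rng.standard_normal())        # shared half-normal factor
    u1 = L @ rng.standard_normal(n)        # correlated Gaussian part
    return delta * u0 * np.sqrt(np.diag(cov)) + np.sqrt(1 - delta ** 2) * u1

def relation_aware_attention(scores, cov, alpha=2.0, seed=0):
    # perturb deterministic attention logits with relation-aware
    # skew-normal noise, then renormalize with a softmax
    rng = np.random.default_rng(seed)
    noisy = scores + skew_normal_sample(cov, alpha, rng)
    e = np.exp(noisy - noisy.max())
    return e / e.sum()
```

Under these assumptions, the sampled noise biases attention toward items that co-occur often or share features, which is one plausible reading of how the latent space makes the attention "relation-aware".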

Keywords:
Computer science, Probabilistic logic, Machine learning, Recommender system, Skew, Latent Dirichlet allocation, Relation (database), Artificial intelligence, Transformer, Data mining, Topic model, Engineering

Metrics

Cited By: 30
FWCI (Field Weighted Citation Impact): 7.26
Refs: 36
Citation Normalized Percentile: 0.97 (in top 1% and top 10%)

Topics

Recommender Systems and Techniques
Physical Sciences →  Computer Science →  Information Systems
Topic Modeling
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence