JOURNAL ARTICLE

Time interval-aware graph with self-attention for sequential recommendation

Abstract

Sequential recommendation, a branch of recommender systems, infers changes in a user's interests from the user's interaction history in order to predict the next item. Neural architectures such as the Transformer and Graph Neural Networks (GNNs) have been widely used in recommender systems because of their ability to represent sequences and capture high-order information. However, previous models only rank actions in their order of occurrence, ignoring the time interval between adjacent actions, which often reflects the user's preferences. To make full use of time information, we design the Time Interval-aware Graph with Self-attention for sequential recommendation (TIGSA). Specifically, we first construct a time interval-aware graph that integrates the time intervals across all user action sequences; the time interval between two items determines the weight of the corresponding edge. Item embeddings that incorporate time-interval information are then obtained through Graph Convolutional Networks (GCNs). Finally, a self-attention block adaptively computes the attention weights of the items in the sequence. Experiments on three public datasets show that our method outperforms other recommendation models across different evaluation metrics.
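The pipeline described in the abstract (interval-weighted graph, GCN, then self-attention) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weighting scheme `1 / (1 + interval)`, the single GCN layer, the single-head attention, and all names and dimensions are assumptions for exposition.

```python
import numpy as np

def build_interval_graph(seq, timestamps, n_items, max_interval=100):
    """Item-item adjacency; edge weight shrinks as the time interval grows
    (assumed scheme: weight = 1 / (1 + clipped interval))."""
    A = np.zeros((n_items, n_items))
    for i, j, dt in zip(seq[:-1], seq[1:], np.diff(timestamps)):
        w = 1.0 / (1.0 + min(dt, max_interval))
        A[i, j] += w
        A[j, i] += w  # symmetric, treating the graph as undirected
    return A

def gcn_layer(A, X, W):
    """One graph-convolution layer: self-loops, symmetric normalization,
    linear projection, ReLU."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A_hat.sum(axis=1)))
    return np.maximum(d_inv_sqrt @ A_hat @ d_inv_sqrt @ X @ W, 0.0)

def self_attention(H):
    """Single-head scaled dot-product self-attention over the sequence."""
    d = H.shape[-1]
    scores = H @ H.T / np.sqrt(d)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ H

rng = np.random.default_rng(0)
n_items, d = 5, 8
seq = [0, 2, 1, 4, 3]             # one toy user action sequence
ts = [0, 5, 6, 40, 41]            # toy timestamps for those actions
A = build_interval_graph(seq, ts, n_items)
X = rng.normal(size=(n_items, d))            # initial item embeddings
H = gcn_layer(A, X, rng.normal(size=(d, d))) # interval-aware embeddings
out = self_attention(H[seq])                 # attend over the user's sequence
print(out.shape)                             # (5, 8)
```

In the full model the attention output for the last position would be scored against all item embeddings to rank candidate next items; here only the representation step is shown.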


Metrics

Cited by: 1
FWCI (Field Weighted Citation Impact): 0.38
References: 26
Citation Normalized Percentile: 0.67

Topics

Recommender Systems and Techniques (Physical Sciences → Computer Science → Information Systems)
Advanced Graph Neural Networks (Physical Sciences → Computer Science → Artificial Intelligence)
Data Management and Algorithms (Physical Sciences → Computer Science → Signal Processing)