JOURNAL ARTICLE

Locker: Locally Constrained Self-Attentive Sequential Recommendation

Abstract

Recently, self-attentive models have shown promise in sequential recommendation, given their potential to capture users' long-term preferences and short-term dynamics simultaneously. Despite their success, we argue that self-attention modules, as non-local operators, often fail to capture short-term user dynamics accurately due to a lack of inductive local bias. To examine this hypothesis, we conduct an analytical experiment on controlled 'short-term' scenarios. We observe a significant performance gap between self-attentive recommenders with and without local constraints, which implies that short-term user dynamics are not sufficiently learned by existing self-attentive recommenders. Motivated by this observation, we propose Locker, a simple plug-and-play framework for self-attentive recommenders. By combining the proposed local encoders with existing global attention heads, Locker enhances short-term user dynamics modeling while retaining the long-term semantics captured by standard self-attentive encoders. We investigate Locker with five different local methods, outperforming state-of-the-art self-attentive recommenders on three datasets by 17.19% ([email protected]) on average.
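The abstract describes combining local encoders with global attention heads inside one self-attention layer. Below is a minimal illustrative sketch of that idea, not the paper's implementation: some heads attend causally over the full sequence (global) while others are restricted to a sliding window of recent items (local), and the head outputs are concatenated. All function names, the window size, and the head split are assumptions for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(q, k, v, mask):
    # q, k, v: (n, dh); mask: (n, n) boolean, True = may attend
    scores = q @ k.T / np.sqrt(q.shape[-1])
    scores = np.where(mask, scores, -1e9)  # block masked positions
    return softmax(scores) @ v

def locker_layer(x, n_heads=4, n_local=2, window=2, seed=0):
    """Sketch of a Locker-style layer: the first `n_local` heads use a
    causal sliding-window mask (short-term dynamics); the remaining
    heads use the standard full causal mask (long-term semantics)."""
    n, d = x.shape
    dh = d // n_heads
    rng = np.random.default_rng(seed)
    idx = np.arange(n)
    causal = np.tril(np.ones((n, n), dtype=bool))
    # local heads may only look back `window` steps
    local = causal & (idx[None, :] >= idx[:, None] - window)
    heads = []
    for h in range(n_heads):
        Wq, Wk, Wv = (rng.standard_normal((d, dh)) / np.sqrt(d)
                      for _ in range(3))
        mask = local if h < n_local else causal
        heads.append(attention(x @ Wq, x @ Wk, x @ Wv, mask))
    return np.concatenate(heads, axis=-1)  # (n, d)
```

Because only the attention mask differs between head groups, such a local constraint can be dropped into an existing multi-head layer without changing its input/output shapes, which is what makes the approach plug-and-play.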

Keywords:
Sequential recommendation, Self-attention, Recommender system, Short-term dynamics, Semantics, Encoder, Machine learning, Artificial intelligence

Metrics

Cited by: 50
FWCI (Field-Weighted Citation Impact): 12.23
References: 14
Citation Normalized Percentile: 0.98

Topics

Recommender Systems and Techniques (Physical Sciences → Computer Science → Information Systems)
Topic Modeling (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Bandit Algorithms Research (Social Sciences → Decision Sciences → Management Science and Operations Research)