JOURNAL ARTICLE

Meta-Learned User Preference Estimator with Attention Network for Cold-Start Recommendation

Shilong Liu, Yang Liu, Xiaotong Zhang, Cheng Xu, Jie He, Yue Qi

Year: 2023 · Journal: Journal of Physics: Conference Series · Vol: 2504 (1) · Pages: 012028 · Publisher: IOP Publishing

Abstract

One crucial challenge in the recommendation research field is the cold-start problem. Meta-learning is a feasible approach to reducing cold-start recommendation error because it can adapt to new tasks rapidly with relatively few updates. However, meta-learning does not take users' diverse interests into account, which limits performance improvement in cold-start scenarios. We propose a recommendation model, the attentional meta-learned user preference estimator, that combines an attention network with meta-learning. The method strengthens personalized user-interest modelling by learning attention weights between users and items, thereby improving cold-start recommendation performance. We validated the model on two publicly available recommendation datasets. Compared with three benchmark methods, the proposed model reduces the mean absolute error by at least 2.3% and the root mean square error by at least 2.5%.
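The abstract's two ingredients — attention-pooled user preferences and meta-learning's fast per-user adaptation — can be illustrated with a minimal sketch. This is not the paper's implementation: the embedding dimension, the scaled dot-product attention form, the finite-difference gradient, and all function names here are assumptions made for illustration only.

```python
import numpy as np

D = 8  # embedding dimension (illustrative choice, not from the paper)

def attention_preference(user_emb, item_embs):
    """Pool a user's interacted-item embeddings into one preference vector,
    weighting each item by scaled dot-product attention against the user."""
    scores = item_embs @ user_emb / np.sqrt(D)      # (n_items,)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                        # softmax over items
    return weights @ item_embs                      # (D,) preference vector

def predict(user_emb, support_items, target_item):
    """Score a target item as the dot product with the attended preference."""
    return attention_preference(user_emb, support_items) @ target_item

def inner_adapt(user_emb, support_items, support_ratings, lr=1e-3, eps=1e-5):
    """One inner-loop gradient step on the user's small support set,
    mimicking meta-learning's rapid adaptation to a cold-start user.
    Gradient via finite differences to keep the sketch dependency-free."""
    def loss(u):
        preds = np.array([predict(u, support_items, it) for it in support_items])
        return np.mean((preds - support_ratings) ** 2)
    base = loss(user_emb)
    grad = np.zeros_like(user_emb)
    for i in range(D):
        bumped = user_emb.copy()
        bumped[i] += eps
        grad[i] = (loss(bumped) - base) / eps
    return user_emb - lr * grad

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    user = rng.standard_normal(D)
    items = rng.standard_normal((3, D))        # 3 support interactions
    ratings = np.array([1.0, 0.0, 1.0])
    adapted = inner_adapt(user, items, ratings)
    print("preference vector shape:", attention_preference(adapted, items).shape)
```

In a full meta-learning setup the outer loop would then update the shared initialization across many such per-user adaptations; the sketch shows only the inner, per-user step that makes few-shot cold-start adaptation possible.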

Keywords:
Cold start, Benchmark, Estimator, Mean squared error, Preference, Machine learning, Artificial intelligence, Recommender system, Collaborative filtering, Computer science, Statistics

Metrics

Cited By: 0
FWCI (Field Weighted Citation Impact): 0.00
Refs: 4
Citation Normalized Percentile: 0.09

Topics

Recommender Systems and Techniques
Physical Sciences →  Computer Science →  Information Systems
Advanced Bandit Algorithms Research
Social Sciences →  Decision Sciences →  Management Science and Operations Research
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence