Book Chapter

Towards Context-Aware Social Recommendation via Trust Networks

Xin Liu

Year: 2013 · Series: Lecture Notes in Computer Science · Pages: 121-134 · Publisher: Springer Science+Business Media

Abstract

Utilizing social network information to improve recommendation quality has recently attracted much attention. However, most existing social recommendation models cannot handle well the heterogeneity and diversity of social relationships (e.g., different friends may give different recommendations on the same items in different situations). Furthermore, few models take into account (non-social) contextual information, which has been shown to be another valuable information source for accurate recommendation. In this paper, we propose to construct trust networks on top of a social network to measure the quality of a friend's recommendations in different contexts. We employ a random walk to collect the most relevant ratings based on the multi-dimensional trustworthiness of users in the trust network. A factorization machines model is then applied to the collected ratings to predict missing ratings under various contexts. Evaluation on a real dataset demonstrates that our approach improves on the accuracy of state-of-the-art social, context-aware, and trust-aware recommendation models by at least 5.54% (MAE) and 9.15% (RMSE).
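The abstract's second stage, applying factorization machines (FM) to the ratings collected by the random walk, follows the standard second-order FM prediction rule: a global bias, linear weights, and pairwise feature interactions factorized through latent vectors. The sketch below is not the authors' implementation; it only illustrates the generic FM prediction (using the well-known O(kn) reformulation of the pairwise term), with the feature vector `x` assumed to encode a user, an item, and context indicators.

```python
import numpy as np

def fm_predict(x, w0, w, V):
    """Second-order factorization machine prediction.

    x  : (n,) feature vector (e.g., one-hot user/item plus context features)
    w0 : global bias
    w  : (n,) linear weights
    V  : (n, k) latent factor matrix, one k-dim vector per feature
    """
    linear = w0 + x @ w
    # Pairwise interactions sum_{i<j} <v_i, v_j> x_i x_j computed via
    # 0.5 * sum_f [ (sum_i V[i,f] x_i)^2 - sum_i V[i,f]^2 x_i^2 ],
    # which avoids the quadratic loop over feature pairs.
    s = V.T @ x                                   # (k,)
    pairwise = 0.5 * np.sum(s**2 - (V**2).T @ (x**2))
    return linear + pairwise
```

In a context-aware setting, the context dimensions (time, mood, companion, etc.) simply become extra columns of `x`, so their interactions with user and item features are learned through the same latent factors.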

Keywords:
Social trust · Context · Computer science · Social capital · Social science

Metrics

Cited By: 22
FWCI (Field Weighted Citation Impact): 4.62
References: 28
Citation Normalized Percentile: 0.95

Topics

Recommender Systems and Techniques
Physical Sciences →  Computer Science →  Information Systems
Caching and Content Delivery
Physical Sciences →  Computer Science →  Computer Networks and Communications
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence
