JOURNAL ARTICLE

Context-Aware User Profiling and Multimedia Content Classification for Smart Devices

Abstract

Current solutions for delivering adapted multimedia content in mobile environments take only a limited set of contextual information into account and can be regarded as passive. We propose a new solution that anticipates users' needs, based on their contexts of use and preferences, to deliver media content to mobile users. This paper describes the profiling approach of the proposed solution and a context-aware, content-based recommendation method for smart devices. Recommendations are issued based on user history and driven by real-time contextual conditions.
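The combination the abstract describes, content-based scoring from user history re-weighted by the current context, can be illustrated with a minimal sketch. The paper's actual profiling and recommendation algorithms are not given on this page, so every name, feature vector, context tag, and weighting factor below is an assumption for illustration only.

```python
# Illustrative sketch of context-aware, content-based recommendation.
# NOT the paper's algorithm: features, context tags, and the 0.5/0.5
# contextual weighting are hypothetical choices.

def cosine(a, b):
    """Cosine similarity between two equal-length feature vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = sum(x * x for x in a) ** 0.5
    nb = sum(y * y for y in b) ** 0.5
    return dot / (na * nb) if na and nb else 0.0

def recommend(history, catalog, context, top_n=2):
    """Score unseen items by similarity to the user's history profile,
    then boost items whose context tags match the current context."""
    dims = len(next(iter(catalog.values()))["features"])
    # User profile: mean feature vector of previously consumed items.
    profile = [0.0] * dims
    for item_id in history:
        for i, v in enumerate(catalog[item_id]["features"]):
            profile[i] += v / len(history)
    scored = []
    for item_id, item in catalog.items():
        if item_id in history:
            continue
        base = cosine(profile, item["features"])
        # Contextual boost: fraction of current context tags the item matches.
        match = len(context & item["contexts"]) / max(len(context), 1)
        scored.append((item_id, base * (0.5 + 0.5 * match)))
    scored.sort(key=lambda t: t[1], reverse=True)
    return [item_id for item_id, _ in scored[:top_n]]

# Hypothetical media catalog with content features and context tags.
catalog = {
    "news_clip":  {"features": [1.0, 0.0, 0.2], "contexts": {"commuting", "morning"}},
    "full_movie": {"features": [0.1, 1.0, 0.9], "contexts": {"home", "evening"}},
    "highlights": {"features": [0.9, 0.1, 0.3], "contexts": {"commuting", "wifi"}},
}
print(recommend(["news_clip"], catalog, {"commuting", "morning"}))
# → ['highlights', 'full_movie']
```

With a history of short news content and a "commuting, morning" context, the short highlights clip outranks the full movie both on content similarity and on contextual fit, which is the anticipatory behaviour the abstract contrasts with passive delivery.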

Keywords:
Profiling, Computer science, Multimedia, Mobile devices, Context awareness, World Wide Web, Media content, Human–computer interaction

Metrics

Cited by: 5
FWCI (Field-Weighted Citation Impact): 2.42
References: 20
Citation Normalized Percentile: 0.91

Topics

Recommender Systems and Techniques
Physical Sciences → Computer Science → Information Systems
Video Analysis and Summarization
Physical Sciences → Computer Science → Computer Vision and Pattern Recognition
Multimedia Communication and Technology
Social Sciences → Social Sciences → Sociology and Political Science

Related Documents

BOOK-CHAPTER

Modelling Context-Aware Multimedia Community Content on Mobile Devices

Diana Weiß

Lecture Notes in Computer Science, Year: 2007, Pages: 834-844
JOURNAL ARTICLE

User-Driven Multimedia Adaptation Framework for Context-aware Learning Content Service

Atchara Rueangprathum, Somchai Limsiroratana, Suntorn Witosurapot

Journal: Journal of Advances in Information Technology, Year: 2016, Vol: 7(3), Pages: 182-185
BOOK-CHAPTER

Context-Aware Modeling of Multimedia Content

Encyclopedia of Multimedia, Year: 2008, Pages: 118-119
BOOK-CHAPTER

Context-Aware Modeling of Multimedia Content

Encyclopedia of Multimedia, Year: 2006, Pages: 125-126