Mobile cloud computing (MCC) has been extensively studied as a way to deliver pervasive healthcare services more affordably. By offloading computation-intensive tasks from mobile devices to the cloud, a significant portion of energy can be saved, extending mobile battery life, which is critical to maintaining continuous and uninterrupted healthcare services. However, given ever-changing clinical severity, personal demands, and environmental conditions, it is essential to explore a context-aware approach that can dynamically determine the optimal task offloading strategies and algorithmic settings, with the goal of achieving a balanced trade-off among energy efficiency, diagnostic accuracy, and processing latency. To this end, we propose a model-free reinforcement-learning-based task scheduling approach that adapts to these changing requirements.
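To illustrate the idea, the following is a minimal, hypothetical sketch of model-free learning for the local-vs-offload decision, here simplified to tabular Q-learning over a toy context of battery level and network quality. All states, costs, and weights are illustrative assumptions, not values or components of the proposed approach.

```python
import random

random.seed(0)

# Toy context-aware offloading via tabular, model-free Q-learning.
# Costs and weights below are invented for illustration only.

ACTIONS = ("local", "offload")
STATES = [(b, n) for b in ("low", "high") for n in ("good", "poor")]

def reward(state, action):
    """Negative weighted cost of energy plus latency (toy numbers)."""
    battery, net = state
    if action == "local":
        energy, latency = 3.0, 1.0
    else:  # offload: cheap on energy, latency depends on the network
        energy, latency = 1.0, 1.5 if net == "good" else 4.0
    w_energy = 2.0 if battery == "low" else 1.0  # prize energy when battery is low
    return -(w_energy * energy + latency)

Q = {s: {a: 0.0 for a in ACTIONS} for s in STATES}
alpha, epsilon = 0.1, 0.2

for _ in range(2000):
    s = random.choice(STATES)                    # a context arrives
    if random.random() < epsilon:                # epsilon-greedy exploration
        a = random.choice(ACTIONS)
    else:
        a = max(ACTIONS, key=lambda x: Q[s][x])
    Q[s][a] += alpha * (reward(s, a) - Q[s][a])  # bandit-style update

policy = {s: max(ACTIONS, key=lambda a: Q[s][a]) for s in STATES}
print(policy)
```

Under these toy costs, the learned policy offloads when the network is good or the battery is low, and computes locally when the battery is high but the network is poor, mirroring the energy/latency trade-off described above.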
Cláudio Aroucha, Higo Felipe Pires