Chun Liu, Shengxin Li, Kexin Shi, Haorong Chen, Shanglin Li, Dimitris Kanellopoulos
Mobile devices face challenges in running compute-intensive applications due to limited battery, storage, and processing power. Edge computing alleviates this by offloading tasks to edge servers or cloud data centers. However, task scheduling in dynamic environments, influenced by fluctuating factors such as wireless channel quality and battery levels, remains a complex problem. This paper presents a reinforcement learning-based approach to jointly optimize task scheduling and resource allocation, aiming to minimize application completion time under energy constraints. We propose Q-learning and Deep Q-Network (DQN) algorithms to tackle the problem. Experimental results show that DQN outperforms baseline algorithms such as greedy and random scheduling, particularly under varying energy constraints, highlighting its effectiveness in dynamic edge computing scenarios.
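To make the Q-learning formulation concrete, the following is a minimal toy sketch of value-based offloading. It is not the paper's model: the state space (battery level and channel quality, each discretized into three buckets), the two actions (execute locally vs. offload to the edge), the reward function, and the transition dynamics are all hypothetical assumptions chosen only to illustrate the tabular Q-learning update.

```python
import random

# Illustrative Q-learning sketch for a toy offloading decision.
# State: (battery_level, channel_quality), each discretized to {0, 1, 2}.
# Action: 0 = execute locally, 1 = offload to an edge server.
# All rewards and dynamics below are hypothetical, not from the paper.

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.1
STATES = [(b, c) for b in range(3) for c in range(3)]
ACTIONS = [0, 1]
Q = {(s, a): 0.0 for s in STATES for a in ACTIONS}

def reward(state, action):
    battery, channel = state
    # Hypothetical reward: offloading pays off when the channel is good;
    # local execution pays off only when the battery is well charged.
    if action == 1:
        return channel - 1   # -1, 0, or +1
    return battery - 1       # -1, 0, or +1

def step(state, action):
    # Toy dynamics: local execution drains the battery; channel drifts randomly.
    battery, channel = state
    if action == 0:
        battery = max(0, battery - 1)
    channel = min(2, max(0, channel + random.choice((-1, 0, 1))))
    return (battery, channel)

def choose(state):
    # Epsilon-greedy action selection.
    if random.random() < EPSILON:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: Q[(state, a)])

random.seed(0)
for _ in range(2000):            # episodes
    s = random.choice(STATES)
    for _ in range(10):          # steps per episode
        a = choose(s)
        s2 = step(s, a)
        # Standard Q-learning update: Q(s,a) += alpha * (TD target - Q(s,a)).
        target = reward(s, a) + GAMMA * max(Q[(s2, b)] for b in ACTIONS)
        Q[(s, a)] += ALPHA * (target - Q[(s, a)])
        s = s2

# Inspect the greedy action in a representative state (full battery, good channel).
best = max(ACTIONS, key=lambda a: Q[((2, 2), a)])
print("greedy action at (2, 2):", best)
```

A DQN replaces the tabular `Q` dictionary with a neural network approximator, which is what lets the approach scale to the larger, continuous state spaces (channel gains, residual energy) found in real edge scenarios.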