Mobile edge computing (MEC) has emerged as a promising computing paradigm that provides computing resources to mobile devices. However, when a massive number of tasks remain unprocessed, an edge node becomes overloaded, which significantly impairs task execution and degrades the user experience of applications. Because the load on edge nodes is dynamic, it is difficult to make the best offloading decision for each mobile device. In this paper, we investigate the joint optimization of resource allocation and task offloading in MEC systems for non-divisible, latency-sensitive tasks, taking edge load dynamics and latency constraints into account. To determine offloading decisions and minimize the average system energy cost in a dynamic environment, we design a distributed offloading algorithm based on deep reinforcement learning. Specifically, we use a long short-term memory (LSTM) network for load prediction and combine reinforcement learning techniques (a prioritized replay buffer, double DQN, and a dueling deep Q-network) to improve algorithm performance. Simulation results show that, compared with three benchmark methods, the proposed algorithm allocates resources more effectively, achieving lower average system energy consumption while guaranteeing low latency.
Wei Feng, Hao Liu, Yingbiao Yao, Diqiu Cao, Mingxiong Zhao
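The two DQN refinements named in the abstract (a dueling head and double-DQN target selection) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the linear "networks", weight shapes, and function names are assumptions made for the example, and load prediction and prioritized replay are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
n_features, n_actions = 4, 3  # illustrative sizes

# Stand-in for a neural network: one value head and one advantage head,
# each a single linear layer. A real agent would use deeper networks.
def make_params():
    return (rng.normal(size=(n_features, 1)),          # value-head weights
            rng.normal(size=(n_features, n_actions)))  # advantage-head weights

online, target = make_params(), make_params()

def dueling_q(features, w_v, w_a):
    """Dueling head: Q(s, a) = V(s) + A(s, a) - mean_a A(s, a)."""
    v = features @ w_v      # state value, shape (1,)
    adv = features @ w_a    # per-action advantages, shape (n_actions,)
    return v + adv - adv.mean()

def double_dqn_target(reward, gamma, next_features, online, target):
    """Double DQN: the online net selects the argmax action,
    the target net evaluates it, reducing overestimation bias."""
    a_star = int(np.argmax(dueling_q(next_features, *online)))
    return reward + gamma * float(dueling_q(next_features, *target)[a_star])

# One hypothetical transition: compute the bootstrapped training target.
s_next = rng.normal(size=n_features)
y = double_dqn_target(reward=1.0, gamma=0.9, next_features=s_next,
                      online=online, target=target)
```

Subtracting the mean advantage makes the value/advantage decomposition identifiable, and decoupling action selection from evaluation is what distinguishes double DQN from vanilla Q-learning targets.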