Mobile edge computing (MEC) is a key enabling technology for future 6G wireless networks: it provides lower-latency service and more efficient resource utilization for intelligent applications and the Internet of Things (IoT), while also reducing the energy consumption of end devices. In a dynamic edge environment, the task offloading problem is entangled with several factors, such as the uncertainty of online task arrivals, the heterogeneity of edge servers, and the mobility of devices. In this paper, accounting for random online task arrivals, time-varying channels, and device mobility, a deep reinforcement learning-based online task offloading (DRL-OTO) algorithm is designed to minimize the energy consumption of all mobile devices. Specifically, by characterizing a system model comprising a communication model, an energy consumption model, and a node mobility model, the task offloading optimization problem is formulated as a mixed-integer nonlinear programming (MINLP) problem. This problem is decomposed into two steps: each mobile device first selects the edge server to offload to, and then the DRL-OTO algorithm, built on the deep deterministic policy gradient (DDPG) method, determines that device's offloading rate. Simulation results show that the proposed DRL-OTO algorithm converges quickly and reduces energy consumption, thereby increasing the utility of all devices in the dynamic edge environment.
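The two-step decomposition described above (server selection followed by a continuous offloading rate chosen by a DDPG-style actor) can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the state features, the greedy channel-gain selection rule, and the network sizes are all assumptions made for the example, and the actor weights are untrained random values standing in for a policy that DDPG would learn against a critic.

```python
import numpy as np

rng = np.random.default_rng(0)

def select_server(channel_gains):
    """Step 1 (assumed heuristic): the device picks the edge server
    with the strongest channel gain."""
    return int(np.argmax(channel_gains))

class ActorSketch:
    """Minimal stand-in for a DDPG actor: a one-hidden-layer network
    mapping the device's local state to an offloading rate in [0, 1].
    Weights are random here; DDPG would train them with the
    deterministic policy gradient."""
    def __init__(self, state_dim, hidden=16):
        self.w1 = rng.normal(scale=0.1, size=(state_dim, hidden))
        self.w2 = rng.normal(scale=0.1, size=(hidden, 1))

    def act(self, state):
        h = np.tanh(state @ self.w1)
        # Sigmoid squashes the output to a valid rate in [0, 1].
        return float(1.0 / (1.0 + np.exp(-(h @ self.w2))))

# Example: a device observing 3 servers' channel gains plus its own
# normalized task backlog (a hypothetical 4-dimensional state).
gains = np.array([0.2, 0.9, 0.5])
server = select_server(gains)            # step 1: discrete choice
state = np.concatenate([gains, [0.7]])   # 0.7 = assumed backlog
rate = ActorSketch(state_dim=4).act(state)  # step 2: continuous rate
print(server, round(rate, 3))
```

Splitting the discrete server choice from the continuous rate lets DDPG, which handles continuous actions, operate only on the second subproblem.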
Haixing Wu, Jingwei Geng, Xiaojun Bai, Shunfu Jin