Haibo Ge, Xing Song, Shixiong Ma, Linghuan Liu, Shun Li, Xutao Cheng, Ting Zhou, Haodong Feng
Mobile edge computing (MEC) provides users with computation offloading services by deploying computing resources at the network edge. Traditional solution methods struggle to satisfy mobile devices' (MDs') requirements for low-latency, low-energy task processing. This paper therefore designs a multi-user computation-offloading MEC system based on deep reinforcement learning. Under reasonable constraints, the mobile service is modeled as the weighted sum of the delay and energy consumption over the entire execution of the service, and minimizing the total system cost (this weighted sum) is taken as the optimization objective. A deep Q-learning algorithm is proposed in which each mobile device makes offloading decisions without knowing the task model. To improve performance, model training and offloading decision-making are placed at the MEC server and the MD, respectively. Simulation results show that, compared with other algorithms, the proposed algorithm reduces the delay and energy consumption of service processing.
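The abstract's objective, the weighted sum of delay and energy per task, can be sketched as follows. This is a minimal illustration, not the paper's actual model: all parameter names and values (`f_local`, `f_mec`, `rate`, `p_tx`, the effective-capacitance constant `kappa`) are hypothetical assumptions chosen only to make the trade-off concrete.

```python
def task_cost(cycles, data_bits, offload, w_delay=0.5, w_energy=0.5,
              f_local=1e9, kappa=1e-27, f_mec=10e9, rate=5e6, p_tx=0.5):
    """Weighted delay-plus-energy cost of one task for a mobile device.

    cycles    -- CPU cycles the task requires
    data_bits -- input size uploaded if the task is offloaded
    offload   -- True: execute at the MEC server; False: execute locally
    All rates, frequencies, and powers are illustrative placeholder values.
    """
    if offload:
        t_up = data_bits / rate                  # uplink transmission delay
        delay = t_up + cycles / f_mec            # upload + edge execution
        energy = p_tx * t_up                     # device spends only transmit energy
    else:
        delay = cycles / f_local                 # local execution delay
        energy = kappa * f_local ** 2 * cycles   # common dynamic CPU energy model
    return w_delay * delay + w_energy * energy

# A greedy baseline would pick the cheaper option per task; the paper's DQN
# instead learns the offloading decision from observed costs.
local_cost = task_cost(5e8, 2e6, offload=False)
edge_cost = task_cost(5e8, 2e6, offload=True)
decision = "offload" if edge_cost < local_cost else "local"
```

Comparing the two costs per task gives the greedy baseline that learning-based schemes are typically evaluated against.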