Changwei Wan, Songtao Guo, Yuanyuan Yang
Computation offloading is an effective method to relieve user equipment (UE) of limited battery capacity and computation resources in mobile edge computing (MEC) networks. However, it is challenging to obtain offloading strategies in a timely and accurate manner under diverse computation task requirements and changing channel states in a multi-user, resource-constrained network environment. In this paper, we consider network dynamics and UEs' resource constraints and aim to minimize the energy consumption of all UEs by jointly optimizing the offloading decision, the central processing unit (CPU) frequency, and the power split ratio in a dynamic MEC network. Specifically, we introduce simultaneous wireless information and power transfer (SWIPT) technology into MEC networks to prolong UEs' operation time. More importantly, we propose an online computation offloading algorithm based on deep deterministic policy gradient (DDPG), named Enhanced DDPG (EDDPG), to solve the energy consumption minimization problem. In particular, EDDPG can make real-time decisions without complete network information and adapt to time-varying environments and different requirements. Furthermore, we incorporate prioritized experience replay into EDDPG to accelerate convergence by reusing informative experience tuples. Simulation results show that the proposed algorithm effectively reduces the energy consumption of UEs and enables them to complete more computing tasks within the time limit. Compared with other baseline methods, it accelerates convergence and improves system performance effectively.
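The abstract mentions prioritized experience replay as the mechanism EDDPG uses to accelerate convergence. As the paper's implementation details are not given here, the following is a minimal sketch of the proportional-priority variant of a replay buffer; all class and method names are illustrative, not the authors' code.

```python
import random


class PrioritizedReplayBuffer:
    """Sketch of proportional prioritized experience replay.

    Transitions are sampled with probability proportional to priority**alpha,
    where priority is derived from the TD error, so informative experience
    tuples are replayed more often. Names and defaults are assumptions.
    """

    def __init__(self, capacity, alpha=0.6):
        self.capacity = capacity
        self.alpha = alpha          # degree of prioritization (0 = uniform)
        self.buffer = []            # stored (state, action, reward, next_state) tuples
        self.priorities = []        # one priority per stored transition
        self.pos = 0                # next write index (ring buffer)

    def add(self, transition, td_error=1.0):
        # Small epsilon keeps every transition sampleable.
        priority = (abs(td_error) + 1e-6) ** self.alpha
        if len(self.buffer) < self.capacity:
            self.buffer.append(transition)
            self.priorities.append(priority)
        else:
            self.buffer[self.pos] = transition
            self.priorities[self.pos] = priority
        self.pos = (self.pos + 1) % self.capacity

    def sample(self, batch_size):
        # Indices are drawn proportionally to stored priorities.
        idx = random.choices(range(len(self.buffer)),
                             weights=self.priorities, k=batch_size)
        return idx, [self.buffer[i] for i in idx]

    def update_priorities(self, indices, td_errors):
        # After a learning step, refresh priorities with the new TD errors.
        for i, err in zip(indices, td_errors):
            self.priorities[i] = (abs(err) + 1e-6) ** self.alpha
```

In a DDPG-style training loop, the critic's TD error for each sampled transition would be fed back via `update_priorities`, so recently surprising transitions are revisited sooner than well-fit ones.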