Mobile Edge Computing (MEC) extends computational capacity and storage resources to the network edge, meeting the real-time, low-latency application demands of mobile devices (MDs). However, because MD resources are limited, offloading complex computational tasks to MEC servers remains a significant challenge. This paper proposes a Deep Reinforcement Learning (DRL)-based computation offloading strategy that addresses the offloading problem with the goal of minimizing latency. We construct an optimization model that minimizes system latency subject to constraints on computational resources and energy consumption. We then formulate the computation offloading problem as a Markov Decision Process (MDP) and employ policy gradient methods to learn and optimize the model parameters. Experimental results validate the latency-minimization capability of our DRL offloading strategy across different environments and task requirements, demonstrating its efficiency and practicality compared with conventional offloading strategies.
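To make the approach concrete, the following is a minimal illustrative sketch (not the paper's actual model) of a policy-gradient (REINFORCE-style) learner choosing between local execution and MEC offloading so as to minimize latency. The latency values, the two-action setup, and all parameter names are assumptions introduced here for illustration only.

```python
import numpy as np

# Toy setup (assumed, for illustration): one mobile device repeatedly decides
# whether to execute a task locally (action 0) or offload it to the MEC
# server (action 1). The latency constants below are hypothetical.
rng = np.random.default_rng(0)
LOCAL_LATENCY = 0.8   # seconds: local execution on the MD (assumed value)
MEC_LATENCY = 0.3     # seconds: transmission + edge execution (assumed value)

def latency(action):
    """Observed task-completion latency with small random noise."""
    base = LOCAL_LATENCY if action == 0 else MEC_LATENCY
    return base + rng.normal(0.0, 0.02)

def policy(theta):
    """Softmax policy over the two offloading actions."""
    e = np.exp(theta - theta.max())
    return e / e.sum()

def train(theta, episodes=2000, lr=0.1):
    """REINFORCE with a running-average baseline; reward = -latency."""
    baseline = 0.0
    for _ in range(episodes):
        probs = policy(theta)
        a = rng.choice(2, p=probs)
        reward = -latency(a)                 # minimizing latency
        baseline = 0.9 * baseline + 0.1 * reward
        # grad of log pi(a | theta) for a softmax policy: one_hot(a) - probs
        grad_log = np.eye(2)[a] - probs
        theta = theta + lr * (reward - baseline) * grad_log
    return theta

theta = train(np.zeros(2))
probs = policy(theta)
# After training, probability mass shifts toward the lower-latency action
# (offloading to the MEC server, under the assumed latencies above).
```

In the paper's setting the state would additionally encode task size, channel conditions, and available resources, and the policy would be a deep network rather than a two-logit softmax; this sketch only shows the policy-gradient update mechanics.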
Md. Tarek Hassan, Md. Kamal Hosain
Yameng Zhang, Tong Liu, Yanmin Zhu, Yuanyuan Yang
Jie Feng, F. Richard Yu, Qingqi Pei, Xiaoli Chu, Jianbo Du, Li Zhu
Miaojiang Chen, Tian Wang, Shaobo Zhang, Anfeng Liu