Mobile Edge Computing (MEC) has emerged as a pivotal technology for meeting the growing demands of mobile applications. However, in highly dynamic MEC environments, load balancing and performance optimization across servers remain challenging. This work focuses on server load balancing during task offloading in MEC. It constructs a framework for ultra-dense network environments and formulates joint computation offloading and resource allocation as a Markov Decision Process (MDP). A learning algorithm based on Proximal Policy Optimization (PPO) is then proposed to reduce the standard deviation of server loads, thereby achieving load balancing, while simultaneously minimizing the system's total delay and energy consumption and improving the efficiency of the MEC system. Simulation results demonstrate that, compared with a random offloading strategy, an all-offloading strategy, and the Deep Deterministic Policy Gradient (DDPG) algorithm, the proposed algorithm consistently achieves superior load balancing across varying numbers of users and task sizes.
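The abstract's load-balancing objective can be illustrated with a minimal sketch. The paper does not give its reward function; the code below is a hypothetical reward combining the load standard deviation with total delay and energy, with weights `w_balance`, `w_delay`, and `w_energy` introduced here purely for illustration:

```python
import statistics

def load_std(server_loads):
    """Standard deviation of per-server loads; lower means better balance."""
    return statistics.pstdev(server_loads)

def reward(server_loads, total_delay, total_energy,
           w_balance=1.0, w_delay=1.0, w_energy=1.0):
    """Hypothetical scalar reward: penalize load imbalance, delay, and energy.

    The weights are illustrative assumptions, not values from the paper.
    """
    return -(w_balance * load_std(server_loads)
             + w_delay * total_delay
             + w_energy * total_energy)

# A perfectly balanced assignment incurs no imbalance penalty, so for equal
# delay and energy it is preferred over a skewed assignment:
balanced = [4.0, 4.0, 4.0]
skewed = [1.0, 4.0, 7.0]
print(reward(balanced, 2.0, 3.0) > reward(skewed, 2.0, 3.0))  # → True
```

An RL agent such as PPO would maximize this reward over offloading decisions, which simultaneously drives the load standard deviation toward zero and the delay/energy terms toward their minima.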