In Mobile Edge Computing (MEC), cloud computing services are extended to the network's edge, close to end-users, so that applications run near where requests originate. To minimize delay and response time, particularly in healthcare contexts, users' requests should be distributed evenly among edge servers. To this end, we propose a load balancing method for the balanced distribution of requests in MEC: to relieve an overloaded edge server, additional load is assigned to an edge server with spare capacity. We formulate the load balancing problem as a Markov decision process (MDP) over the MEC environment and apply reinforcement learning to avoid overload on edge servers and to reduce the response time to emergency requests. The load balancing problem is simulated using iFogSim. The simulation results demonstrate that the proposed method outperforms other methods in average execution delay, load balancing, and average response time, and is applicable to healthcare and emergency scenarios.
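The MDP formulation described above can be illustrated with a minimal sketch: states are the discretized load levels of the edge servers, actions choose which server receives an incoming request, and the reward penalizes routing to heavily loaded servers so that a Q-learning agent learns an overload-avoiding, balanced policy. All names, parameters, and the reward shape below are illustrative assumptions, not the paper's implementation.

```python
import random

# Hypothetical sketch of the MDP in the abstract (not the paper's code):
#   state  = tuple of per-server load levels
#   action = index of the edge server that receives the request
#   reward = penalty proportional to the chosen server's load,
#            with a large penalty for overload
NUM_SERVERS = 3
CAPACITY = 10                    # assumed per-server request capacity
ALPHA, GAMMA, EPS = 0.1, 0.9, 0.2

def state_key(loads):
    """Discretize per-server loads into a hashable MDP state."""
    return tuple(loads)

def reward(loads, chosen):
    """Penalize sending work to loaded servers (overload avoidance)."""
    if loads[chosen] >= CAPACITY:
        return -10.0             # overload: strong penalty
    return -float(loads[chosen]) # prefer the least-loaded server

def train(episodes=2000, requests_per_episode=20, seed=0):
    """Tabular Q-learning over simulated request arrivals."""
    rng = random.Random(seed)
    q = {}                       # Q-table: (state, action) -> value
    for _ in range(episodes):
        loads = [0] * NUM_SERVERS
        for _ in range(requests_per_episode):
            s = state_key(loads)
            if rng.random() < EPS:                 # explore
                a = rng.randrange(NUM_SERVERS)
            else:                                  # exploit
                a = max(range(NUM_SERVERS),
                        key=lambda i: q.get((s, i), 0.0))
            r = reward(loads, a)
            if loads[a] < CAPACITY:
                loads[a] += 1
            s2 = state_key(loads)
            best_next = max(q.get((s2, i), 0.0) for i in range(NUM_SERVERS))
            old = q.get((s, a), 0.0)
            q[(s, a)] = old + ALPHA * (r + GAMMA * best_next - old)
        # servers drain between episodes (simplified dynamics)
    return q

def dispatch(q, loads):
    """Greedy policy: route a request to the best server for this state."""
    s = state_key(loads)
    return max(range(NUM_SERVERS), key=lambda i: q.get((s, i), 0.0))
```

In this toy setting the learned policy tends to route each request toward less-loaded servers, mirroring the abstract's idea of shifting extra load to the edge server with more remaining capacity; the paper's actual state, action, and reward definitions may differ.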
Mohammad Esmaeil Esmaeili, Ahmad Khonsari, Mahdi Dolati