Fen Han, Nuo Yu, Jiakai Gong, Yuan Ge, Xueshan Gao
With the development of new technologies, many mobile computing applications impose stricter requirements on latency and network resources, which resource-constrained mobile devices struggle to meet. To address this issue, mobile edge computing (MEC) has emerged: the computing tasks requested by users are scheduled onto MEC servers, alleviating the resource constraints on mobile devices. Since the bandwidth and computing resources of a single edge server are limited, each server can serve only a limited number of users. Moreover, in real networks the coverage areas of adjacent servers overlap. Optimizing task scheduling is therefore crucial to improving MEC performance in the face of numerous task requests. This paper investigates the task scheduling problem under both delay and bandwidth resource constraints in a multi-user, multi-edge-server scenario. The optimization goal is to maximize the number of users that the edge servers can serve. A deep reinforcement learning (DRL) approach is proposed to solve this problem; it efficiently determines the connection decisions between users and base stations to maximize the reward. The proposed DRL method is compared with an Integer Linear Programming (ILP) method and a greedy heuristic based on the nearest-neighbor assignment principle. Simulation results show that the DRL method outperforms the existing methods and maintains superior performance even under multiple constraints.
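As a point of reference for the baselines mentioned above, the greedy nearest-neighbor heuristic can be sketched as follows. This is an illustrative reconstruction, not the paper's exact algorithm: the function name, the uniform per-user bandwidth/compute demands, and the data layout are assumptions made for the example.

```python
import math

def greedy_nearest_assign(users, servers, demand_bw, demand_cpu):
    """Greedy baseline: assign each user to the nearest server
    that still has enough bandwidth and compute capacity.

    users:   list of (x, y) user positions
    servers: list of dicts with 'pos' (x, y) and remaining 'bw', 'cpu'
    demand_bw, demand_cpu: per-user resource demand (assumed uniform)
    Returns the number of users served.
    """
    served = 0
    for ux, uy in users:
        # Candidate servers ordered by Euclidean distance (nearest first).
        for sv in sorted(servers,
                         key=lambda sv: math.hypot(sv['pos'][0] - ux,
                                                   sv['pos'][1] - uy)):
            if sv['bw'] >= demand_bw and sv['cpu'] >= demand_cpu:
                sv['bw'] -= demand_bw   # reserve bandwidth on this server
                sv['cpu'] -= demand_cpu  # reserve compute on this server
                served += 1
                break  # user is served; move on to the next user
    return served
```

Because it commits each user to the closest feasible server without look-ahead, this heuristic can strand later users when nearby servers fill up, which is the weakness the DRL scheduler is designed to avoid.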