Laha Ale, Ning Zhang, Xiaojie Fang, Xianfu Chen, Shaohua Wu, Longzhuang Li
The Internet of Things (IoT) is considered an enabling platform for a variety of promising applications, such as smart transportation and smart cities, where massive numbers of devices are interconnected for data collection and processing. These IoT applications place high demands on storage and computing capacity, while IoT devices themselves are usually resource-constrained. As a potential solution, mobile edge computing (MEC) deploys cloud resources in the proximity of IoT devices so that their requests can be better served locally. In this work, we investigate computation offloading in a dynamic MEC system with multiple edge servers, where computational tasks with various requirements are dynamically generated by IoT devices and offloaded to MEC servers in a time-varying operating environment (e.g., channel conditions change over time). The objective of this work is to maximize the number of tasks completed before their respective deadlines and to minimize energy consumption. To this end, we propose an end-to-end Deep Reinforcement Learning (DRL) approach that selects the best edge server for offloading and allocates the optimal computational resources such that the expected long-term utility is maximized. Simulation results demonstrate that the proposed approach outperforms existing methods.
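The abstract's objective (reward completing tasks before their deadlines while penalizing energy use) and its DRL action of picking an edge server can be illustrated with a minimal sketch. The utility shape, the trade-off weight `beta`, and the epsilon-greedy selection over Q-value estimates are illustrative assumptions, not details taken from the paper:

```python
import random

def utility(completed_before_deadline: bool, energy_joules: float,
            beta: float = 0.5) -> float:
    # Hypothetical per-step utility: +1 if the task met its deadline,
    # minus an energy penalty weighted by `beta` (assumed form, not
    # the paper's exact reward function).
    completion_reward = 1.0 if completed_before_deadline else 0.0
    return completion_reward - beta * energy_joules

def select_server(q_values: list, epsilon: float,
                  rng: random.Random) -> int:
    # Epsilon-greedy choice over per-server Q-value estimates, the
    # standard exploration rule in DQN-style agents; the paper's actual
    # network architecture and policy are not specified here.
    if rng.random() < epsilon:
        return rng.randrange(len(q_values))  # explore: random server
    return max(range(len(q_values)), key=lambda i: q_values[i])  # exploit
```

In a full agent, `q_values` would come from a neural network evaluated on the current system state (channel conditions, server loads, task requirements), and `utility` would serve as the reward signal driving training.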