Anastasios Giannopoulos, Ilias Paralikas, Sotirios Spantideas, Panagiotis Trakadas
The Cloud-Edge Computing Continuum (CEC), a system in which edge and cloud nodes are seamlessly connected, is dedicated to handling the substantial computational loads offloaded by end-users. These tasks can suffer from delays or be dropped entirely when deadlines are missed, particularly under fluctuating network conditions and resource limitations. The CEC therefore calls for hybrid task offloading, where task placement decisions concern whether tasks are processed locally, offloaded vertically to the cloud, or offloaded horizontally to interconnected edge servers. In this paper, we present a distributed hybrid task offloading scheme (HOODIE) designed to jointly optimize task latency and drop rate under dynamic CEC traffic. HOODIE employs a model-free deep reinforcement learning (DRL) framework, in which distributed DRL agents at each edge server autonomously determine offloading decisions without global awareness of the task distribution. To further enhance system proactivity and learning stability, we incorporate techniques such as Long Short-Term Memory (LSTM), Dueling deep Q-networks (DQN), and double-DQN. Extensive simulation results demonstrate that HOODIE effectively reduces task drop rates and average task processing delays, outperforming several baseline methods under changing CEC settings and dynamic conditions.
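The abstract names double-DQN as one of the stabilization techniques. As a minimal sketch of that idea (not the paper's actual implementation: the function and parameter names, and the discount factor, are illustrative assumptions), the double-DQN target selects the next action with the online network but evaluates it with the target network, which reduces the overestimation bias of vanilla DQN:

```python
# Hedged sketch of the double-DQN target computation.
# All names (double_dqn_target, next_q_online, next_q_target, GAMMA)
# are illustrative assumptions, not taken from the paper.

GAMMA = 0.9  # discount factor (assumed value)

def double_dqn_target(reward, next_q_online, next_q_target, done):
    """Compute the double-DQN target for one transition.

    Action selection uses the online network's Q-values; action
    evaluation uses the target network's Q-values.
    """
    if done:
        return reward
    # argmax over the online network's Q-values for the next state
    best_action = max(range(len(next_q_online)), key=lambda a: next_q_online[a])
    # evaluate that action with the target network
    return reward + GAMMA * next_q_target[best_action]

# Example: the online net prefers action 1; the target net evaluates it.
y = double_dqn_target(reward=1.0,
                      next_q_online=[0.2, 0.8, 0.5],
                      next_q_target=[0.3, 0.6, 0.9],
                      done=False)
# y = 1.0 + 0.9 * 0.6 = 1.54
```

In a full agent, this target would be regressed against the online network's Q-value for the taken action; the LSTM and dueling-head components mentioned in the abstract would sit inside the Q-network itself.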