Kexin Li, Xingwei Wang, Qiang He, Mingzhou Yang, Min Huang, Schahram Dustdar
This article investigates how to enhance the performance of Multi-access Edge Computing (MEC) systems with the aid of device-to-device (D2D) communication for computation offloading. By exploiting a novel computation offloading mechanism based on D2D collaboration, users can efficiently share computational resources with one another. However, it is challenging to distinguish the valuable information that truly promotes a collaborative decision, as worthless information can hinder collaboration among users. In addition, transmitting large volumes of information requires high bandwidth and incurs significant latency and computational complexity, resulting in unacceptable costs. In this article, we propose an efficient D2D-assisted MEC computation offloading framework based on Attention Communication Deep Reinforcement Learning (ACDRL). First, the framework models the interactions between related entities, including device-to-device collaboration in the horizontal direction and device-to-edge offloading in the vertical direction. Second, we develop a distributed cooperative reinforcement learning algorithm with an attention mechanism that skews computational resources toward active users, avoiding unnecessary resource waste in large-scale MEC systems. Finally, to improve the effectiveness and rationality of cooperation among users, we introduce a communication channel that integrates information from all users in a communication group, thus facilitating cooperative decision-making. The proposed framework is benchmarked against other baseline approaches, and the experimental results show that it effectively reduces latency and provides valuable insights for practical design.
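The attention-weighted aggregation the abstract alludes to can be sketched as follows. This is a minimal illustration, not the authors' implementation: the query vector, state dimensions, and function name are all hypothetical, and it shows only the generic scaled dot-product attention idea of weighting each user's shared state so that the aggregated "communication channel" message leans toward active users.

```python
# Hypothetical sketch of attention-weighted information aggregation for a
# communication group. Each user's state vector is scored against a (learned)
# query; softmax weights then skew the aggregated message toward users whose
# states score highly. All names and shapes here are illustrative assumptions.
import numpy as np

def attention_aggregate(query, user_states):
    """Aggregate user states into one message via scaled dot-product attention."""
    d = query.shape[-1]
    scores = user_states @ query / np.sqrt(d)        # one relevance score per user
    scores -= scores.max()                           # subtract max for numerical stability
    weights = np.exp(scores) / np.exp(scores).sum()  # softmax attention weights
    return weights @ user_states                     # weighted sum over user states

rng = np.random.default_rng(0)
user_states = rng.normal(size=(5, 8))  # 5 users, 8-dim shared state each (assumed)
query = rng.normal(size=8)             # query vector, learned in practice (assumed)
message = attention_aggregate(query, user_states)
print(message.shape)  # (8,)
```

In a full multi-agent DRL system the query and the state encodings would be produced by trained networks, and the aggregated message would feed each agent's offloading policy; here plain NumPy stands in for that machinery.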