In this paper, we propose a novel approach for optimal resource management and caching in ultra-reliable low-latency communication (URLLC)-enabled Internet of Vehicles (IoV) networks. The proposed framework includes mobile edge computing (MEC) servers integrated into roadside units (RSUs), unmanned aerial vehicles (UAVs), and base stations (BSs) for hybrid vehicle-to-vehicle (V2V) and vehicle-to-infrastructure (V2I) communication. To enhance the accuracy of the global model while accounting for the mobility characteristics of vehicles, we leverage an asynchronous federated learning (AFL) algorithm. The resource allocation problem is formulated to achieve the best allocation of frequency, computation, and caching resources while satisfying the delay constraints. To solve this non-convex problem, a multi-agent actor-critic deep reinforcement learning algorithm, termed DMAAC, is introduced. Additionally, a cooperative caching scheme based on the AFL framework, called Co-Ca, is proposed, which utilizes a Dueling Deep-Q-Network (DDQN) to predict frequently accessed contents and cache them efficiently. Extensive simulation results show the effectiveness of the proposed framework and algorithms compared to existing schemes.
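As background for the Co-Ca scheme, the core idea of a dueling Q-network is to split the Q-value into a state-value stream and an advantage stream, then recombine them. The aggregation step can be sketched as follows; this is an illustrative sketch of the dueling aggregation formula only, not the paper's implementation, and the function name and inputs are hypothetical:

```python
def dueling_q(value, advantages):
    """Combine the two streams of a dueling Q-network.

    Implements Q(s, a) = V(s) + (A(s, a) - mean_a' A(s, a')),
    where subtracting the mean advantage keeps the decomposition
    identifiable. `value` is the scalar V(s); `advantages` is a
    list of A(s, a) values, one per action (e.g., per candidate
    content to cache).
    """
    mean_adv = sum(advantages) / len(advantages)
    return [value + a - mean_adv for a in advantages]


# Example: one state value, three candidate actions.
q_values = dueling_q(1.0, [0.0, 1.0, 2.0])
# The action with the highest advantage gets the highest Q-value.
best_action = max(range(len(q_values)), key=lambda i: q_values[i])
```

In a caching context, each action could correspond to caching one candidate content item, and the highest-Q action would be selected for the cache.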
Bishmita Hazarika, Keshav Singh, Sandeep Kumar Singh, Cunhua Pan, Trung Q. Duong