The rapidly expanding Internet of Vehicles (IoV) poses many challenges, such as providing ubiquitous connectivity and high-quality services to large numbers of vehicles, which generate large volumes of time-sensitive and computationally expensive data. Owing to limits on on-board computational power and battery capacity, vehicles lack the processing capability of mobile edge computing (MEC) servers and cannot handle these data in a timely manner on their own. In this study, we propose a task offloading method for vehicle edge computing (VEC) based on deep reinforcement learning (DRL), which combines reinforcement learning (RL) with deep learning (DL) and transfers computationally demanding tasks, such as data processing, to a VEC server with greater processing power. We adopt the Actor-Critic algorithm, which replaces the maintenance of a value-function table with the training of neural network models, improving both convergence efficiency and training quality. Simulation results show that the proposed Actor-Critic based DRL algorithm can significantly improve the effectiveness of VEC servers and reduce vehicle cost.
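To make the Actor-Critic offloading idea concrete, the following is a minimal illustrative sketch in PyTorch, not the paper's implementation. The state features (task size, deadline, channel gain), the two-action space (process locally vs. offload to the VEC server), and the `step_cost` reward model are all hypothetical assumptions introduced here for illustration; the key point is that a shared network produces both a policy (actor) and a state-value estimate (critic), so no value-function table is maintained.

```python
import torch
import torch.nn as nn

class ActorCritic(nn.Module):
    """Shared trunk with an actor head (offloading policy) and a critic head (state value)."""
    def __init__(self, state_dim=3, n_actions=2, hidden=32):
        super().__init__()
        self.shared = nn.Sequential(nn.Linear(state_dim, hidden), nn.ReLU())
        self.actor = nn.Linear(hidden, n_actions)  # logits over {local, offload}
        self.critic = nn.Linear(hidden, 1)         # V(s) estimate

    def forward(self, s):
        h = self.shared(s)
        return torch.distributions.Categorical(logits=self.actor(h)), self.critic(h)

def step_cost(state, action):
    # Hypothetical cost model: local cost grows with task size;
    # offload cost = transmission (worse with poor channel gain) + server compute.
    size, deadline, gain = state.tolist()
    local = size * 1.0
    offload = size / (gain + 0.1) * 0.3 + size * 0.2
    return -(local if action == 0 else offload)  # reward = negative cost

# One advantage-based Actor-Critic update on a batch of random states.
model = ActorCritic()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
states = torch.rand(64, 3)
dist, values = model(states)
actions = dist.sample()
rewards = torch.tensor([step_cost(s, a.item()) for s, a in zip(states, actions)])
advantage = rewards - values.squeeze(-1)
# Policy gradient weighted by the advantage, plus a critic regression term.
loss = -(dist.log_prob(actions) * advantage.detach()).mean() + advantage.pow(2).mean()
opt.zero_grad()
loss.backward()
opt.step()
```

In a full VEC setting the state would also carry server load and queueing information, and the update would run over trajectories rather than a single batch, but the actor/critic split and the advantage-weighted policy update are the same.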
Wenxiu Xu, Ningjiang Chen, Huan Tu
Sifeng Zhu, Yuhu Yang, Hai Zhu, Ruin Qiao