With the rapid advancement of the Internet of Vehicles (IoV), providing connectivity and high-quality services to vehicles has become a major challenge. Mobile edge computing (MEC) helps address the problem that vehicles may generate large volumes of data that cannot be processed in time, since an MEC server has far greater computing power than a vehicle. This paper presents a model for offloading vehicular edge computing (VEC) tasks using deep reinforcement learning (DRL). Because the vehicle's own computing capability is limited, tasks can be delegated to the VEC server's greater computing capacity, and a resource reservation server (RRS) reserves resources on behalf of the VEC server. We combine reinforcement learning (RL) with deep learning (DL) to improve convergence, and adopt the Actor-Critic algorithm to improve the training efficiency of the model. Simulation results show that our Actor-Critic-based DRL algorithm effectively improves the efficiency of VEC servers and reduces vehicle cost.
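The Actor-Critic offloading idea described above can be sketched in a few lines. The sketch below is illustrative only, not the paper's model: it assumes a toy one-step cost model (the vehicle CPU speed, server CPU speed, and bandwidth constants are invented for the example), a linear softmax actor over a simple feature vector, and a linear critic used as a value baseline.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy resource parameters (assumed for illustration, not from the paper)
VEH_CPU, SRV_CPU, BANDWIDTH = 1.0, 8.0, 4.0

def cost(task_size, action):
    """Completion cost; action 0 = execute locally, 1 = offload to VEC server."""
    if action == 0:
        return task_size / VEH_CPU
    return task_size / BANDWIDTH + task_size / SRV_CPU

def features(task_size):
    return np.array([1.0, task_size])  # bias term + task size

theta = np.zeros((2, 2))   # actor weights: one column of logits per action
w = np.zeros(2)            # critic weights (state-value baseline)
alpha_a, alpha_c = 0.05, 0.1

for step in range(3000):
    x = features(rng.uniform(0.5, 4.0))
    logits = x @ theta
    probs = np.exp(logits - logits.max()); probs /= probs.sum()
    a = rng.choice(2, p=probs)
    r = -cost(x[1], a)                 # reward = negative completion cost
    delta = r - x @ w                  # one-step advantage (bandit-style TD error)
    w += alpha_c * delta * x           # critic update toward observed reward
    grad = -probs; grad[a] += 1.0      # grad of log softmax policy w.r.t. logits
    theta += alpha_a * delta * np.outer(x, grad)  # actor (policy-gradient) update

# Under these toy constants offloading is always cheaper (s/4 + s/8 < s),
# so the learned policy should favour offloading a sizeable task.
x = features(3.0)
logits = x @ theta
p = np.exp(logits - logits.max()); p /= p.sum()
p_offload = p[1]
```

In this simplified setting each decision is a single step, so the critic reduces to a learned baseline; the paper's full DRL formulation would instead learn values over a sequential state (e.g. queue and channel conditions).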
Bingxin Wang, Xueqi Yuan, Yuqing Chai, Qingman Zhang, Yanzhu Gon, T. Y. Xing