JOURNAL ARTICLE

Deep reinforcement learning-based task offloading and service caching in vehicular edge computing

Abstract

As a promising solution, Vehicular Edge Computing (VEC) enables driverless vehicles to process computation-intensive, latency-sensitive tasks by offloading them to Mobile Edge Computing (MEC) servers or the cloud. However, when server resources are limited, reducing service latency and improving the efficiency of service-request processing remains challenging. To tackle this problem, we propose a joint task offloading and service caching framework aimed at minimizing the cost of unmanned vehicles. We first formulate the problem as a mixed-integer nonlinear programming (MINLP) problem and then transform it into a solvable Partially Observable Markov Decision Process (POMDP). We then design a DDQNL-based task offloading algorithm framework to solve this problem. The performance of the proposed algorithm is validated through comparisons with baseline algorithms.
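The abstract's DDQNL framework builds on the double-DQN principle: the action for the bootstrap target is selected with the online network but valued with the target network, which reduces the overestimation bias of plain Q-learning. A minimal tabular sketch of that update, applied to a toy offloading decision (the states, actions, and reward here are hypothetical illustrations, not the paper's model):

```python
import numpy as np

rng = np.random.default_rng(0)

N_STATES = 4    # hypothetical coarse load levels observed by the vehicle
N_ACTIONS = 3   # 0: compute locally, 1: offload to MEC server, 2: offload to cloud

q_online = np.zeros((N_STATES, N_ACTIONS))
q_target = np.zeros((N_STATES, N_ACTIONS))

def ddqn_update(s, a, r, s_next, alpha=0.1, gamma=0.9):
    """Double-DQN style update: select argmax with the online table,
    evaluate that action with the target table."""
    a_best = int(np.argmax(q_online[s_next]))          # selection: online
    td_target = r + gamma * q_target[s_next, a_best]   # evaluation: target
    q_online[s, a] += alpha * (td_target - q_online[s, a])

# Toy episode stream: offloading to the MEC server (action 1) is assumed
# to yield the lowest latency cost, i.e. the highest reward.
for _ in range(500):
    s = rng.integers(N_STATES)
    a = rng.integers(N_ACTIONS)
    r = 1.0 if a == 1 else 0.0          # hypothetical negative-latency reward
    s_next = rng.integers(N_STATES)
    ddqn_update(s, a, r, s_next)
    q_target = 0.99 * q_target + 0.01 * q_online   # soft target-table sync

print(int(np.argmax(q_online.mean(axis=0))))  # → 1 (prefers MEC offloading)
```

The soft target sync stands in for the periodic hard copy used in the original DDQN; the paper's actual networks, state space, and cost model are not specified in this abstract.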

Keywords:
Reinforcement learning; Edge computing; Mobile edge computing; Task offloading; Service caching; Vehicular ad hoc network; Cloud computing

Metrics

Cited by: 1
FWCI (Field-Weighted Citation Impact): 0.84
References: 0
Citation Normalized Percentile: 0.62

Topics

Caching and Content Delivery
Physical Sciences →  Computer Science →  Computer Networks and Communications
IoT and Edge/Fog Computing
Physical Sciences →  Computer Science →  Computer Networks and Communications
Privacy-Preserving Technologies in Data
Physical Sciences →  Computer Science →  Artificial Intelligence