Chaogang Tang, Huaming Wu, Ruidong Li, Joel J. P. C. Rodrigues
In Vehicular Edge Computing (VEC) environments, the increasingly complicated functional and non-functional requirements of vehicular applications such as MetaVehicles usually incur larger task-input data sizes, which not only increase the transmission delay of task-input data over the front-haul links but also degrade the quality of experience for users, even when computation tasks can be offloaded to and executed at the network edge. In this article, we propose a caching-enabled task offloading strategy that caches and reuses universal context data at the edge server to avoid duplicated data transmission in VEC systems. The goal is to minimize the overall response latency of all tasks by jointly optimizing the task offloading, content caching, and resource allocation decisions in VEC. The optimization problem is formulated as a Mixed-Integer Nonlinear Programming (MINLP) problem. To solve it efficiently, we decompose it into two subproblems, namely the computing Resource Allocation (RA) problem and the Joint Offloading and Caching (JOC) problem, and put forward corresponding algorithms to solve the content caching and task offloading problems, respectively. Numerical evaluation reveals that our strategies and algorithms achieve better performance in minimizing the overall response latency than other approaches.
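The computing Resource Allocation (RA) subproblem mentioned above typically admits a closed form once the offloading and caching decisions are fixed. As a hedged illustration (the abstract does not give the paper's actual formulation), assume each offloaded task `i` requires `w_i` CPU cycles and receives edge CPU frequency `f_i`, so its execution latency is `w_i / f_i`; minimizing the total latency subject to a capacity budget `sum(f_i) <= F` yields, by the Cauchy-Schwarz inequality, the square-root-proportional allocation `f_i* ∝ sqrt(w_i)`. The function name and workload values below are hypothetical:

```python
import math

def allocate_cpu(workloads, total_capacity):
    """Closed-form allocation minimizing sum(w_i / f_i) subject to
    sum(f_i) <= total_capacity. By Cauchy-Schwarz, the optimum is
    f_i* = total_capacity * sqrt(w_i) / sum_j sqrt(w_j)."""
    roots = [math.sqrt(w) for w in workloads]
    s = sum(roots)
    return [total_capacity * r / s for r in roots]

# Hypothetical example: two offloaded tasks sharing a 10-unit edge CPU.
workloads = [2.0, 8.0]                       # required CPU cycles per task
alloc = allocate_cpu(workloads, total_capacity=10.0)
latency = sum(w / f for w, f in zip(workloads, alloc))
```

With these numbers the square-root rule gives a total latency of 1.8, versus 2.0 for an equal split, illustrating why resource allocation is optimized jointly with offloading rather than divided uniformly.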