JOURNAL ARTICLE

Reinforcement learning for resource provisioning in the vehicular cloud

Mohammad A. Salahuddin, Ala Al‑Fuqaha, Mohsen Guizani

Year: 2016 · Journal: IEEE Wireless Communications · Vol: 23 (4) · Pages: 128-135 · Publisher: Institute of Electrical and Electronics Engineers

Abstract

This article presents a concise view of vehicular clouds that incorporates the various vehicular cloud models proposed to date. Essentially, they all extend the traditional cloud and its utility computing functionalities across the entities in the vehicular ad hoc network (VANET). These entities include fixed road-side units (RSUs), on-board units (OBUs) embedded in vehicles, and the personal smart devices of drivers and passengers. Cumulatively, these entities yield abundant processing, storage, sensing and communication resources. However, vehicular clouds require novel resource provisioning techniques that can address the intrinsic challenges of (i) dynamic demands for resources and (ii) stringent QoS requirements. In this article, we show the benefits of reinforcement learning based techniques for resource provisioning in the vehicular cloud. These learning techniques can perceive long-term benefits and are well suited to minimizing the overhead of resource provisioning for vehicular clouds.
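The abstract does not specify the authors' algorithm, but the kind of reinforcement learning it describes can be illustrated with a toy Q-learning sketch. Everything below is an assumption for illustration only: demand is discretized into three levels, the agent chooses how many resource units to provision, and the reward penalizes QoS violations (under-provisioning) more heavily than idle-resource overhead (over-provisioning).

```python
import random

# Hypothetical toy model (not from the article): states are discretized
# demand levels; actions are the number of resource units to provision.
DEMAND_LEVELS = 3       # low / medium / high demand
PROVISION_ACTIONS = 4   # provision 0..3 resource units

def reward(demand, provisioned):
    """Under-provisioning violates QoS and is penalized heavily;
    over-provisioning only wastes idle resources."""
    if provisioned < demand:
        return -10 * (demand - provisioned)  # QoS violation
    return -(provisioned - demand)           # idle-resource overhead

def train(episodes=5000, alpha=0.1, gamma=0.9, epsilon=0.1, seed=0):
    rng = random.Random(seed)
    # Q[state][action], initialized to zero
    Q = [[0.0] * PROVISION_ACTIONS for _ in range(DEMAND_LEVELS)]
    state = rng.randrange(DEMAND_LEVELS)
    for _ in range(episodes):
        # epsilon-greedy exploration
        if rng.random() < epsilon:
            action = rng.randrange(PROVISION_ACTIONS)
        else:
            action = max(range(PROVISION_ACTIONS), key=lambda a: Q[state][a])
        r = reward(state, action)
        next_state = rng.randrange(DEMAND_LEVELS)  # demand evolves randomly
        # standard Q-learning update
        Q[state][action] += alpha * (r + gamma * max(Q[next_state]) - Q[state][action])
        state = next_state
    return Q

Q = train()
# Learned provisioning decision per demand level
policy = [max(range(PROVISION_ACTIONS), key=lambda a: Q[s][a])
          for s in range(DEMAND_LEVELS)]
print(policy)
```

With this reward shape the learned policy matches supply to demand at each level; the same template extends to richer state spaces (e.g., RSU load, vehicle density) by changing the state encoding and reward.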

Keywords:
Computer science, Reinforcement learning, Provisioning, Cloud computing, Resource management, Computer network, Distributed computing, Artificial intelligence, Operating system

Metrics

Cited By: 105
FWCI (Field Weighted Citation Impact): 4.79
Refs: 14
Citation Normalized Percentile: 0.96 (in top 1% and top 10%)

Topics

Vehicular Ad Hoc Networks (VANETs)
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
IoT and Edge/Fog Computing
Physical Sciences →  Computer Science →  Computer Networks and Communications
Caching and Content Delivery
Physical Sciences →  Computer Science →  Computer Networks and Communications