CONFERENCE PAPER

DRL-Cloud: Deep reinforcement learning-based resource provisioning and task scheduling for cloud service providers

Mingxi Cheng, Ji Li, Shahin Nazarian

Year: 2018 Venue: 2018 23rd Asia and South Pacific Design Automation Conference (ASP-DAC) Pages: 129-134

Abstract

Cloud computing has become an attractive computing paradigm in both academia and industry. Through virtualization technology, Cloud Service Providers (CSPs) that own data centers can structure physical servers into Virtual Machines (VMs) to provide services, resources, and infrastructures to users. Profit-driven CSPs charge users for service access and VM rental, and reduce power consumption and electricity bills so as to increase their profit margin. The key challenge faced by CSPs is data center energy cost minimization. Prior works proposed various algorithms to reduce energy cost through Resource Provisioning (RP) and/or Task Scheduling (TS). However, they have scalability issues or do not consider TS with task dependencies, which is a crucial factor for ensuring correct parallel execution of tasks. This paper presents DRL-Cloud, a novel Deep Reinforcement Learning (DRL)-based RP and TS system, to minimize energy cost for large-scale CSPs with a very large number of servers that receive enormous numbers of user requests per day. A deep Q-learning-based two-stage RP-TS processor is designed to automatically generate the best long-term decisions by learning from the changing environment, such as user request patterns and realistic electricity prices. With training techniques such as target networks, experience replay, and exploration and exploitation, the proposed DRL-Cloud achieves remarkably high energy cost efficiency, low reject rate, and low runtime with fast convergence. Compared with one of the state-of-the-art energy-efficient algorithms, the proposed DRL-Cloud achieves up to 320% energy cost efficiency improvement while maintaining a lower reject rate on average. For an example CSP setup with 5,000 servers and 200,000 tasks, compared to a fast round-robin baseline, the proposed DRL-Cloud achieves up to 144% runtime improvement.
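The training ingredients the abstract names (a target network, experience replay, and ε-greedy exploration/exploitation) can be illustrated with a minimal, self-contained sketch. This is not the paper's two-stage RP-TS processor: it reduces the problem to a tabular toy in which each incoming task is assigned to one of three servers under a time-varying electricity price, and the per-server energy costs and price schedule are invented for illustration.

```python
import random
from collections import deque, defaultdict

random.seed(0)

# Hypothetical energy cost per task on each server (server 2 is cheapest).
ENERGY_COST = [3.0, 2.0, 1.0]
N_SERVERS = len(ENERGY_COST)

def step(state, action):
    """Assign a task to a server; reward is negative energy cost.
    The state is the hour of day, driving a crude time-varying price."""
    price = 1.0 + 0.5 * (state % 4)
    reward = -price * ENERGY_COST[action]
    next_state = (state + 1) % 24
    return next_state, reward

q = defaultdict(lambda: [0.0] * N_SERVERS)         # online Q-table
target_q = defaultdict(lambda: [0.0] * N_SERVERS)  # periodically-synced target
replay = deque(maxlen=500)                         # experience replay buffer

alpha, gamma = 0.1, 0.9
epsilon = 1.0
state = 0

for t in range(5000):
    # Epsilon-greedy: explore with probability epsilon, else exploit.
    if random.random() < epsilon:
        action = random.randrange(N_SERVERS)
    else:
        action = max(range(N_SERVERS), key=lambda a: q[state][a])
    next_state, reward = step(state, action)
    replay.append((state, action, reward, next_state))

    # Learn from a random minibatch of past transitions, bootstrapping
    # the TD target from the (frozen) target table.
    for s, a, r, s2 in random.sample(replay, min(len(replay), 8)):
        td_target = r + gamma * max(target_q[s2])
        q[s][a] += alpha * (td_target - q[s][a])

    if t % 100 == 0:  # periodically sync the target table
        for s in list(q):
            target_q[s] = list(q[s])

    epsilon = max(0.05, epsilon * 0.999)
    state = next_state

# The greedy policy should settle on the cheapest server (index 2),
# since the action does not affect the state transition here.
best = max(range(N_SERVERS), key=lambda a: q[0][a])
print(best)  # → 2
```

Replacing the Q-table with a neural network (and the table copy with a parameter copy) recovers the standard DQN recipe the paper builds on; the two-stage structure and task-dependency handling in DRL-Cloud go beyond this sketch.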

Keywords:
Computer science; Cloud computing; Provisioning; Server; Reinforcement learning; Scalability; Distributed computing; Virtualization; Efficient energy use; Scheduling (production processes); Computer network; Artificial intelligence; Operating system; Engineering

Metrics

Cited By: 77
FWCI (Field Weighted Citation Impact): 28.31
References: 31
Citation Normalized Percentile: 1.00 (in top 1% and top 10%)

Topics

Cloud Computing and Resource Management
Physical Sciences →  Computer Science →  Information Systems
IoT and Edge/Fog Computing
Physical Sciences →  Computer Science →  Computer Networks and Communications
Software-Defined Networks and 5G
Physical Sciences →  Computer Science →  Computer Networks and Communications

Related Documents

JOURNAL ARTICLE

Deep Reinforcement Learning for Cloud Resource Provisioning

Ravikumar Perumallaplli

Journal:   SSRN Electronic Journal Year: 2025
JOURNAL ARTICLE

EdgeCloud-DRL: A Deep Reinforcement Learning-Based Task Scheduling Framework for Edge-Cloud Computing

Mohammed Waseem Ahme

Journal:   International Journal of Applied Mathematics Year: 2025 Vol: 38 (6s) Pages: 839-866
JOURNAL ARTICLE

Integrating Deep Reinforcement Learning (DRL) with GAACO for Resource Scheduling in Cloud Computing

Fatma Rjab Almasre

Journal:   African Journal of Advanced Pure and Applied Sciences Year: 2025 Pages: 344-353