Abstract

Cloud Computing systems need to ensure the distribution of resources for processing services within acceptable levels of Quality of Service (QoS). One problem in this context is how to use the available computational resources while avoiding bottlenecks and waste. This paper aims to determine whether a statistical method can predict the amount of resources that should be allocated to new virtual servers. To that end, we used a time series model named ARIMA (autoregressive integrated moving average) as the underlying technique for provisioning new virtual machines in Cloud Computing environments. The results demonstrate the feasibility of using such models in this context.
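The approach the abstract describes, fitting a time-series model to past demand and provisioning virtual machines from its forecast, can be sketched as follows. As a minimal stand-in for the paper's ARIMA model, this sketch fits a simple AR(1) process by least squares; the demand numbers and the per-VM capacity are hypothetical, not taken from the paper.

```python
import math

def fit_ar1(series):
    """Fit y_t = c + phi * y_{t-1} by ordinary least squares."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var
    c = my - phi * mx
    return c, phi

def forecast(series, steps, c, phi):
    """Iterate the fitted model forward `steps` periods."""
    preds, last = [], series[-1]
    for _ in range(steps):
        last = c + phi * last
        preds.append(last)
    return preds

def vms_needed(predicted_load, vm_capacity):
    """Round the predicted load up to a whole number of VMs."""
    return math.ceil(predicted_load / vm_capacity)

# Hypothetical hourly CPU demand observations:
demand = [40, 42, 45, 50, 48, 52, 55, 58, 60, 63]
c, phi = fit_ar1(demand)
next_hour = forecast(demand, 1, c, phi)[0]
print(vms_needed(next_hour, vm_capacity=16))
```

A production version would use a full ARIMA(p, d, q) fit (e.g. `statsmodels.tsa.arima.model.ARIMA` in Python) with model-order selection, rather than a hand-rolled AR(1), but the provisioning loop, forecast demand and then round up to whole VMs, stays the same.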

Keywords:
Provisioning; Cloud computing; Computer science; Time series; Distributed computing; Operating system; Machine learning

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 6
Citation Normalized Percentile: 0.17

Topics

Cloud Computing and Resource Management (Physical Sciences → Computer Science → Information Systems)
IoT and Edge/Fog Computing (Physical Sciences → Computer Science → Computer Networks and Communications)
Data Stream Mining Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
© 2026 ScienceGate Book Chapters — All rights reserved.