JOURNAL ARTICLE

Edge/Cloud Infinite-Time Horizon Resource Allocation for Distributed Machine Learning and General Tasks

Ippokratis Sartzetakis, Polyzois Soumplis, Panagiotis Pantazopoulos, Konstantinos V. Katsaros, Vasilis Sourlas, Emmanouel Varvarigos

Year: 2023   Journal: IEEE Transactions on Network and Service Management   Vol: 21 (1)   Pages: 697-713   Publisher: Institute of Electrical and Electronics Engineers

Abstract

Edge computing has emerged as a computing paradigm in which application and data processing take place close to the end devices. It decreases the distances over which data transfers are made, offering reduced delay and faster response for general data processing and store/retrieve jobs. The benefits of edge computing can also be reaped for distributed computation algorithms, where the cloud also plays an assistive role. In this context, an important challenge is to allocate the required resources at both the edge and the cloud to carry out the processing of data that are generated over a continuous (“infinite”) time horizon. This is a complex problem due to the variety of requirements (resource needs, accuracy, delay, etc.) that may be posed by each computation algorithm, as well as the heterogeneous features of the resources (e.g., processing, bandwidth). In this work, we develop a solution for serving weakly coupled general distributed algorithms, with emphasis on machine learning algorithms, at the edge and/or the cloud. We present a dual-objective Integer Linear Programming formulation that optimizes monetary cost and computation accuracy. We also introduce efficient heuristics to perform the resource allocation. We examine various distributed ML allocation scenarios using realistic parameters from actual vendors. We quantify trade-offs related to accuracy, performance, and cost of edge/cloud bandwidth and processing resources. Our results indicate that, among the many parameters of interest, the processing costs seem to play the most important role in the allocation decisions. Finally, we explore interesting interactions between target accuracy, monetary cost, and delay.

Keywords:
Computer science, Cloud computing, Heuristics, Distributed computing, Edge computing, Resource allocation, Enhanced Data Rates for GSM Evolution, Computation, Bandwidth allocation, Bandwidth (computing), Algorithm, Artificial intelligence, Computer network

Metrics

Cited By: 6
FWCI (Field Weighted Citation Impact): 2.64
Refs: 41
Citation Normalized Percentile: 0.81
Is in top 1%
Is in top 10%

Topics

IoT and Edge/Fog Computing
Physical Sciences →  Computer Science →  Computer Networks and Communications
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Privacy-Preserving Technologies in Data
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Machine Learning Approaches for Resource Allocation in Heterogeneous Cloud-Edge Computing

Ramesh Krishna Mahimalur

Journal: International Journal of Scientific Research in Computer Science Engineering and Information Technology   Year: 2025   Vol: 11 (2)   Pages: 2739-2748
JOURNAL ARTICLE

Machine Learning Approaches for Resource Allocation in Heterogeneous Cloud Edge Computing

Mahimalur, Ramesh

Journal: Zenodo (CERN European Organization for Nuclear Research)   Year: 2025