Edge computing has emerged as a critical infrastructure for deploying AI-driven microservices, particularly for applications requiring low latency and high performance, such as real-time analytics, autonomous systems, and intelligent transportation. However, the dynamic nature of edge environments, characterized by fluctuating network conditions and limited computational resources, presents significant challenges for efficient service orchestration. This study proposes an Adaptive Orchestration Algorithm (AOA) that dynamically optimizes microservice placement and resource allocation in real time, balancing operational costs against Quality of Service (QoS) requirements. By continuously monitoring resource availability, network conditions, and service demand, the AOA adjusts microservice placement to maintain low latency, high availability, and efficient resource utilization. The algorithm is evaluated across test cases simulating real-world edge scenarios, including resource fluctuations, network dynamics, and service demand spikes. Results demonstrate that the AOA significantly reduces latency, improves resource utilization, enhances energy efficiency, and offers superior adaptability compared to traditional static and heuristic-based orchestration approaches. This study highlights the AOA's effectiveness in ensuring resilient and cost-efficient orchestration of AI microservices in dynamic edge environments.
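The adaptive placement loop described above can be sketched in miniature. This is an illustrative approximation only, not the paper's AOA: the node attributes, scoring weights, and migration penalty below are hypothetical assumptions standing in for the monitored metrics (latency, utilization, cost) the abstract mentions.

```python
from dataclasses import dataclass

# Hypothetical model of a monitored edge node; the real AOA's
# state representation is not specified in the abstract.
@dataclass
class EdgeNode:
    name: str
    latency_ms: float      # observed network latency to users
    utilization: float     # CPU utilization in [0, 1]
    cost_per_hour: float   # operational cost

def score(node: EdgeNode, w_lat=0.5, w_util=0.3, w_cost=0.2) -> float:
    """Lower is better: weighted blend of latency, load, and cost.
    The weights are illustrative, not taken from the paper."""
    return (w_lat * node.latency_ms
            + w_util * node.utilization * 100
            + w_cost * node.cost_per_hour)

def place_service(nodes: list[EdgeNode], current: EdgeNode,
                  migration_penalty: float = 5.0) -> EdgeNode:
    """Re-place a microservice only when another node wins by more than
    a migration penalty, damping oscillation under small fluctuations."""
    best = min(nodes, key=score)
    if best is not current and score(best) + migration_penalty < score(current):
        return best
    return current

# Example: the current node is low-latency but heavily loaded, so the
# service migrates to the lightly loaded alternative.
nodes = [
    EdgeNode("edge-a", latency_ms=12.0, utilization=0.85, cost_per_hour=0.10),
    EdgeNode("edge-b", latency_ms=18.0, utilization=0.30, cost_per_hour=0.08),
]
current = nodes[0]
print(place_service(nodes, current).name)  # → edge-b
```

In a running orchestrator, such a decision function would be invoked periodically from a monitoring loop; the hysteresis term (`migration_penalty`) is one common way to trade placement optimality for stability.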
Muhammad Alam, João Rufino, Joaquim Ferreira, Syed Hassan Ahmed, Syed Bilal Hussain Shah, Yuanfang Chen
Gabriele Morabito, Annamaria Ficara, Antonio Celesti, Massimo Villari, Maria Fazio