JOURNAL ARTICLE

Asynchronous Subgradient-Push Algorithm for Distributed Optimization over Directed Graphs

Abstract

This paper proposes an asynchronous subgradient-push algorithm (AsySPA) to solve an additive cost optimization problem over digraphs, where each node has access only to a local convex objective function. The striking feature of our asynchronous algorithm is that every node updates independently using only local information, without any coordination with other nodes. Thus, no global clock or synchronization scheme is needed, and nodes can update their iterates at different rates. To address asynchrony among nodes and communication delays, we design a novel mechanism that adaptively adjusts the stepsize of each update in each node, and we construct a delay-free augmented system. Moreover, we propose a generalized subgradient algorithm, which provides a unified approach covering both the standard subgradient algorithm and the celebrated incremental subgradient algorithm, and use it to prove the exact convergence of the AsySPA. Finally, we demonstrate the efficiency and advantages of the AsySPA over the current state of the art by training a multi-class logistic regression classifier on a large real dataset.
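To make the underlying idea concrete, the following is a minimal sketch of the *synchronous* subgradient-push (push-sum) iteration that AsySPA builds on, not the paper's asynchronous algorithm itself: the asynchrony handling, adaptive stepsizes, and delay-free augmented system described in the abstract are omitted. The network (a 4-node directed ring), the local objectives f_i(x) = (x - b_i)^2, and all parameter values are illustrative assumptions.

```python
import numpy as np

def subgradient_push(b, num_iters=2000, alpha0=1.0):
    """Synchronous subgradient-push sketch minimizing sum_i (x - b[i])^2
    over a directed ring; the optimum is the mean of b.
    (Illustrative only: AsySPA's asynchrony and delays are not modeled.)"""
    n = len(b)
    # Column-stochastic mixing matrix for a directed ring: each node
    # keeps half its mass and pushes half to its out-neighbor.
    A = np.zeros((n, n))
    for j in range(n):
        A[j, j] = 0.5
        A[(j + 1) % n, j] = 0.5
    x = np.zeros(n)   # push-sum numerators
    y = np.ones(n)    # push-sum weights (de-biasing variables)
    for k in range(1, num_iters + 1):
        alpha = alpha0 / k          # diminishing stepsize
        x = A @ x                   # push numerators along the digraph
        y = A @ y                   # push weights along the digraph
        z = x / y                   # de-biased local estimates
        grad = 2.0 * (z - b)        # subgradient of each local (x - b_i)^2
        x = x - alpha * grad        # local subgradient step
    return x / y

# With local targets [1, 2, 3, 6], all nodes approach the optimum mean(b) = 3.
estimates = subgradient_push(np.array([1.0, 2.0, 3.0, 6.0]))
```

The column-stochastic mixing matrix is what makes the method suitable for directed graphs: row-stochastic averaging is generally unavailable over digraphs, and the auxiliary weights `y` correct the bias that column-stochastic mixing alone would introduce.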

Keywords:
Subgradient method, Computer science, Asynchronous communication, Node (physics), Convex optimization, Iterated function, Mathematical optimization, Algorithm, Regular polygon, Mathematics, Computer network, Machine learning

Metrics

Cited By: 0
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 36
Citation Normalized Percentile: 0.15

Topics

Distributed Control Multi-Agent Systems (Physical Sciences → Computer Science → Computer Networks and Communications)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Advanced Memory and Neural Computing (Physical Sciences → Engineering → Electrical and Electronic Engineering)