JOURNAL ARTICLE

Distributed Learning over Unreliable Networks

Abstract

Most of today's distributed machine learning systems assume reliable networks: whenever two machines exchange information (e.g., gradients or models), the network guarantees delivery of the message. At the same time, recent work has demonstrated the impressive tolerance of machine learning algorithms to errors or noise arising from relaxed communication or synchronization. In this paper, we connect these two trends and consider the following question: can we design machine learning systems that are tolerant to network unreliability during training? With this motivation, we focus on a theoretical problem of independent interest: given a standard distributed parameter-server architecture, if every communication between a worker and a server has a non-zero probability p of being dropped, does there exist an algorithm that still converges, and at what speed? The technical contribution of this paper is a novel theoretical analysis proving that distributed learning over unreliable networks can achieve a convergence rate comparable to centralized or distributed learning over reliable networks. Further, we prove that the influence of the packet drop rate diminishes as the number of parameter servers grows. We map this theoretical result onto a real-world scenario, training deep neural networks over an unreliable network layer, and conduct network simulations to validate the system improvement obtained by allowing the network to be unreliable.
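The setting described above can be illustrated with a toy simulation. The sketch below is not the paper's algorithm or analysis; it is a hypothetical illustration of the communication model: several workers compute gradients on their data shards, each worker-to-server message is dropped independently with probability p, and (as one plausible handling of a loss) the server reuses a worker's last successfully delivered gradient when its message is dropped. All names and the fallback-to-stale-gradient rule are assumptions for illustration.

```python
import numpy as np

def simulate_unreliable_sgd(p_drop=0.1, n_workers=4, steps=300, lr=0.1, seed=0):
    """Toy simulation of distributed SGD over a lossy network.

    Each worker holds a shard of a least-squares problem and sends its
    gradient to the server every round; the message is delivered only with
    probability 1 - p_drop. On a drop, the server reuses the worker's last
    delivered (stale) gradient. Returns the final average loss.
    """
    rng = np.random.default_rng(seed)
    d = 10
    # Per-worker shards of f(x) = (1/K) * sum_k 0.5 * mean((A_k x - b_k)^2)
    A = [rng.standard_normal((20, d)) for _ in range(n_workers)]
    x_star = rng.standard_normal(d)          # ground-truth solution
    b = [Ak @ x_star for Ak in A]

    x = np.zeros(d)
    stale = [np.zeros(d) for _ in range(n_workers)]  # last delivered gradients
    for _ in range(steps):
        for k in range(n_workers):
            g = A[k].T @ (A[k] @ x - b[k]) / len(b[k])
            if rng.random() > p_drop:
                stale[k] = g                 # message delivered: update
            # else: packet dropped, server keeps stale[k]
        x = x - lr * np.mean(stale, axis=0)  # server averages what it has

    return sum(0.5 * np.mean((A[k] @ x - b[k]) ** 2)
               for k in range(n_workers)) / n_workers
```

Running this with a moderate drop rate (e.g., p_drop = 0.5) still drives the loss far below its starting value, which is the qualitative behavior the abstract's convergence result formalizes.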

Keywords:
Computer science; Distributed computing; Machine learning; Artificial intelligence; Artificial neural networks; Distributed learning; Distributed algorithms; Parameter server; Synchronization; Convergence; Network packets; Network architecture; Computer networks

Metrics

Cited by: 21
FWCI (Field-Weighted Citation Impact): 0.00
References: 0

Topics

Age of Information Optimization
Physical Sciences →  Computer Science →  Computer Networks and Communications
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Distributed Sensor Networks and Detection Algorithms
Physical Sciences →  Computer Science →  Computer Networks and Communications

Related Documents

JOURNAL ARTICLE

Distributed Constraint-Coupled Optimization over Unreliable Networks

Mohammadreza Doostmohammadian, Usman A. Khan, Alireza Aghasi

Journal: 2022 10th RSI International Conference on Robotics and Mechatronics (ICRoM), Year: 2022, Pages: 371-376
JOURNAL ARTICLE

Communication Efficient Distributed Newton Method over Unreliable Networks

Wen Ming, Chengchang Liu, Yuedong Xu

Journal: Proceedings of the AAAI Conference on Artificial Intelligence, Year: 2024, Vol: 38 (14), Pages: 15832-15840
JOURNAL ARTICLE

Toward Understanding Federated Learning over Unreliable Networks

Chenyuan Feng, Ahmed Arafa, Zihan Chen, Mingxiong Zhao, Tony Q. S. Quek, Howard H. Yang

Journal: IEEE Transactions on Machine Learning in Communications and Networking, Year: 2024, Vol: 3, Pages: 80-97