JOURNAL ARTICLE

Decentralized Federated Learning via SGD over Wireless D2D Networks

Abstract

Federated Learning (FL), an emerging paradigm for fast intelligence acquisition at the network edge, enables joint training of a machine learning model over distributed data sets and computing resources with limited disclosure of local data. Communication is a critical enabler of large-scale FL due to the significant amount of model information exchanged among edge devices. In this paper, we consider a network of wireless devices sharing a common fading wireless channel for the deployment of FL. Each device holds a generally distinct training set, and communication typically takes place in a Device-to-Device (D2D) manner. In the ideal case in which all devices within communication range can communicate simultaneously and noiselessly, a standard protocol that is guaranteed to converge to an optimal solution of the global empirical risk minimization problem under convexity and connectivity assumptions is Decentralized Stochastic Gradient Descent (DSGD). DSGD integrates local SGD steps with periodic consensus averages that require communication between neighboring devices. In this paper, wireless protocols are proposed that implement DSGD by accounting for the presence of path loss, fading, blockages, and mutual interference. The proposed protocols are based on graph coloring for scheduling and on both digital and analog transmission strategies at the physical layer, with the latter leveraging Over-the-Air computing via sparsity-based recovery.
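To make the DSGD iteration described above concrete, the following is a minimal sketch of the ideal noiseless case: devices on a ring topology alternate a local SGD step with a consensus average through a doubly stochastic mixing matrix. The topology, quadratic losses, and all parameters here are illustrative assumptions for the sketch; the paper's wireless scheduling and Over-the-Air protocols are not modeled.

```python
import numpy as np

# Minimal DSGD sketch: each device i holds a synthetic quadratic loss
# f_i(x) = 0.5 * (x - b_i)^2, so the global empirical risk is minimized
# at mean(b). This models only the ideal noiseless D2D case.
rng = np.random.default_rng(0)
n_devices = 8
b = rng.normal(size=n_devices)           # per-device data (local optima)

# Doubly stochastic mixing matrix for a ring topology: each device
# averages its model with itself and its two neighbors.
W = np.zeros((n_devices, n_devices))
for i in range(n_devices):
    W[i, (i - 1) % n_devices] = 1 / 3
    W[i, i] = 1 / 3
    W[i, (i + 1) % n_devices] = 1 / 3

x = np.zeros(n_devices)                  # one scalar model per device
lr = 0.05                                # constant step size
for _ in range(500):
    grad = x - b                         # local gradients
    x = W @ (x - lr * grad)              # local SGD step + consensus average

# Because W is doubly stochastic, the network-wide average of the models
# converges to mean(b), and the devices reach approximate consensus.
```

With a diminishing step size, DSGD converges exactly to the global optimizer under the convexity and connectivity assumptions cited in the abstract; the constant step used here leaves a small residual consensus error.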

Keywords:
Computer science, Distributed computing, Computer network, Stochastic gradient descent, Wireless network, Wireless, Edge device, Fading, Scheduling (production processes), Channel (broadcasting), Cloud computing, Machine learning, Telecommunications, Artificial neural network

Metrics

Cited By: 96
FWCI (Field-Weighted Citation Impact): 9.84
References: 36
Citation Normalized Percentile: 0.98 (in top 1% and top 10%)


Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Cooperative Communication and Network Coding (Physical Sciences → Computer Science → Computer Networks and Communications)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

DRACO: Decentralized Asynchronous Federated Learning Over Row-Stochastic Wireless Networks

Eunjeong Jeong, Marios Kountouris

Journal: IEEE Open Journal of the Communications Society, Year: 2025, Vol: 6, Pages: 4818-4839
JOURNAL ARTICLE

Performance Analysis for Resource Constrained Decentralized Federated Learning Over Wireless Networks

Zhigang Yan, Dong Li

Journal: IEEE Transactions on Communications, Year: 2024, Vol: 72 (7), Pages: 4084-4100
JOURNAL ARTICLE

Communication-Efficient Decentralized Federated Learning for Generalization and Personalization Over Wireless Networks

J. Park, Sunmin Kim, Joohyung Lee, Dusit Niyato

Journal: IEEE Wireless Communications Letters, Year: 2025, Vol: 14 (12), Pages: 4207-4211