JOURNAL ARTICLE

FED-SDS: Adaptive Structured Dynamic Sparsity for Federated Learning Under Heterogeneous Clients

Abstract

Federated Learning (FL) is a widely used distributed learning paradigm that enables continuous, real-time learning while preserving client privacy. Most FL implementations assume that all edge clients have sufficient computational capacity to train a full Deep Neural Network (DNN) model. In practice, however, some clients have limited resources and can only train a significantly smaller local model. To address this system heterogeneity, this paper introduces Fed-SDS, an approach that adaptively tailors the sparsity strategy of each local model. Compared with existing sparse FL schemes, Fed-SDS improves convergence and model accuracy through a novel channel-wise sparsity metric, Mean Weight Magnitude with Gradient (MWMG). We compare Fed-SDS against other sparse FL methods and find that, while those methods can significantly degrade convergence, Fed-SDS achieves the highest task accuracy and fastest convergence across a range of system and data heterogeneity scenarios.
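
The abstract does not spell out the MWMG formula or the channel-selection rule. As a rough illustration only, the sketch below assumes a channel-wise score that blends mean weight magnitude with mean gradient magnitude per output channel, and keeps the top-ranked channels up to a client-specific budget; the names mwmg_scores and select_channels and the parameters alpha and keep_ratio are hypothetical, not taken from the paper.

import numpy as np

def mwmg_scores(weights, grads, alpha=0.5):
    """Score each output channel by a blend of mean weight magnitude and
    mean gradient magnitude (an illustrative stand-in for MWMG; the exact
    formula is not given in the abstract).

    weights, grads: arrays shaped (out_channels, ...), e.g. a conv kernel
    (out_channels, in_channels, kH, kW) and its gradient.
    """
    w_mag = np.abs(weights).reshape(weights.shape[0], -1).mean(axis=1)
    g_mag = np.abs(grads).reshape(grads.shape[0], -1).mean(axis=1)
    return alpha * w_mag + (1.0 - alpha) * g_mag

def select_channels(weights, grads, keep_ratio):
    """Return indices of the top-scoring output channels, keeping a
    fraction keep_ratio of them (a per-client structured-sparsity budget)."""
    scores = mwmg_scores(weights, grads)
    k = max(1, int(round(keep_ratio * len(scores))))
    return np.argsort(scores)[::-1][:k]

# Example: a resource-limited client that can only afford half of a
# 64-channel convolutional layer.
rng = np.random.default_rng(0)
w = rng.normal(size=(64, 32, 3, 3))
g = rng.normal(size=(64, 32, 3, 3))
kept = select_channels(w, g, keep_ratio=0.5)  # indices of 32 retained channels

A per-client keep_ratio like this is one plausible way an adaptive scheme could match each local model's size to that client's compute budget while still pruning whole channels (structured sparsity) rather than individual weights.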

Keywords:
Federated learning; structured sparsity; system heterogeneity; edge devices; deep neural networks; convergence; machine learning

Metrics

Cited by: 3
FWCI (Field-Weighted Citation Impact): 1.92
References: 28
Citation Normalized Percentile: 0.81

Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Internet Traffic Analysis and Secure E-voting (Physical Sciences → Computer Science → Artificial Intelligence)