JOURNAL ARTICLE

Federated Learning With Client Selection and Gradient Compression in Heterogeneous Edge Systems

Yang Xu, Zhida Jiang, Hongli Xu, Zhiyuan Wang, Chen Qian, Chunming Qiao

Year: 2023 | Journal: IEEE Transactions on Mobile Computing | Vol: 23 (5) | Pages: 5446-5461 | Publisher: IEEE Computer Society

Abstract

Federated learning (FL) has recently gained tremendous attention in edge computing and the Internet of Things, due to its capability of enabling distributed clients to cooperatively train models while keeping raw data local. However, existing works usually suffer from limited communication resources, dynamic network conditions, and heterogeneous client properties, which hinder efficient FL. To tackle these challenges simultaneously, we propose a heterogeneity-aware FL framework, called FedCG, with adaptive client selection and gradient compression. Specifically, FedCG introduces diversity into client selection and aims to select a representative client subset while accounting for statistical heterogeneity. The selected clients are assigned different compression ratios based on their heterogeneous and time-varying capabilities. After local training, they upload sparse model updates matching their capabilities for global aggregation, which effectively reduces the communication cost and mitigates the straggler effect. More importantly, instead of naively combining client selection and gradient compression, we highlight that the two decisions are tightly coupled and show the necessity of joint optimization. We theoretically analyze the impact of both client selection and gradient compression on convergence performance. Guided by the convergence rate, we develop an iteration-based algorithm that jointly optimizes client selection and compression-ratio decisions using submodular maximization and linear programming. On this basis, we propose a quantized extension of FedCG, termed Q-FedCG, which further adjusts quantization levels based on gradient innovation. Extensive experiments on both real-world prototypes and simulations show that FedCG and its extension can provide up to 6.4× speedup.
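The abstract's core communication-saving step, uploading sparse model updates whose density matches each client's capability, can be illustrated with a generic top-k sparsification sketch. This is not the authors' FedCG implementation; the per-client ratios, the 10-dimensional updates, and the plain averaging on the server are hypothetical stand-ins for the paper's jointly optimized decisions and weighted aggregation.

```python
import numpy as np

def topk_sparsify(grad: np.ndarray, ratio: float) -> np.ndarray:
    """Keep the largest-magnitude `ratio` fraction of entries; zero the rest.

    A client with a weak uplink gets a small ratio and uploads fewer values.
    """
    k = max(1, int(len(grad) * ratio))
    # Indices of the k entries with the largest absolute value.
    idx = np.argpartition(np.abs(grad), -k)[-k:]
    sparse = np.zeros_like(grad)
    sparse[idx] = grad[idx]
    return sparse

# Three hypothetical clients with heterogeneous, capability-matched ratios.
rng = np.random.default_rng(0)
updates = [rng.standard_normal(10) for _ in range(3)]
ratios = [0.2, 0.5, 1.0]  # e.g., slow, medium, and fast clients
sparse_updates = [topk_sparsify(u, r) for u, r in zip(updates, ratios)]

# Server-side aggregation of the sparse uploads (simple average here).
global_update = np.mean(sparse_updates, axis=0)
```

In the paper, these ratios are not fixed by hand: they are chosen jointly with the client-selection decision, guided by the convergence analysis, via submodular maximization and linear programming.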

Keywords:
Computer science; Speedup; Upload; Selection (genetic algorithm); Gradient descent; Edge device; Distributed computing; Selection algorithm; Mathematical optimization; Machine learning; Artificial neural network; Parallel computing

Metrics

Cited By: 25
FWCI (Field Weighted Citation Impact): 6.39
References: 69
Citation Normalized Percentile: 0.96 (in top 10%)

Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Mobile Crowdsensing and Crowdsourcing (Physical Sciences → Computer Science → Computer Science Applications)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)