JOURNAL ARTICLE

FedStar: Efficient Federated Learning on Heterogeneous Communication Networks

Jing Cao, Ran Wei, Qianyue Cao, Yongchun Zheng, Zongwei Zhu, Cheng Ji, Xuehai Zhou

Year: 2023 Journal: IEEE Transactions on Computer-Aided Design of Integrated Circuits and Systems Vol: 43 (6) Pages: 1848-1861 Publisher: Institute of Electrical and Electronics Engineers

Abstract

The proliferation of multimedia applications and the increased computing power of mobile devices have led to the development of personalized artificial intelligence (AI) applications that utilize the massive amount of user information residing on those devices. However, the traditional centralized training paradigm is not applicable in this scenario due to potential privacy risks and high communication overhead. Federated learning (FL) offers an alternative for these applications. Nevertheless, the heterogeneity of computing and communication latency among devices poses great challenges to building efficient learning frameworks. Existing FL optimizations either fail to speed up training on heterogeneous devices or suffer from poor communication efficiency. In this paper, we propose FedStar, an efficient FL framework that supports decentralized asynchronous training on heterogeneous communication networks. Considering the heterogeneous computing power in the network, FedStar runs heterogeneity-aware local steps on each device. Moreover, considering the heterogeneous communication latency and possibly unreachable communication paths between some devices, FedStar generates a decentralized communication topology that achieves maximal training throughput. Finally, it adopts weighted aggregation to guarantee high convergence accuracy of the global model. Theoretical analysis characterizes the convergence behaviour of FedStar under non-convex settings. Experimental results show that FedStar achieves a speedup of 4.81× over state-of-the-art FL schemes with high convergence accuracy.
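The abstract mentions weighted aggregation of local models but does not state the aggregation rule. As a minimal illustrative sketch (not FedStar's actual method), the common FL pattern is a weighted average of device parameter vectors, with weights proportional to some per-device quantity such as local sample count; all names and the choice of weights below are hypothetical:

```python
import numpy as np

def weighted_aggregate(local_models, weights):
    """Combine per-device parameter vectors into a global model
    via a normalized weighted average (generic FL-style aggregation)."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()  # normalize so the weights sum to 1
    stacked = np.stack(local_models)   # shape: (num_devices, num_params)
    return weights @ stacked           # weighted sum of parameter vectors

# Three devices with heterogeneous (hypothetical) local data volumes.
models = [np.array([1.0, 2.0]), np.array([3.0, 4.0]), np.array([5.0, 6.0])]
data_sizes = [100, 200, 100]  # aggregation weight proportional to sample count
global_model = weighted_aggregate(models, data_sizes)
print(global_model)  # [3. 4.]
```

In a heterogeneity-aware setting, the weights could instead reflect the number of local steps each device actually completed; the paper itself should be consulted for FedStar's precise weighting scheme.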

Keywords:
Computer science; Asynchronous communication; Distributed computing; Latency; Overhead; Speedup; Convergence; Symmetric multiprocessor system; Heterogeneous network; Low latency; Computer network; Wireless network; Parallel computing; Telecommunications

Metrics

Cited By: 6
FWCI (Field-Weighted Citation Impact): 1.53
Refs: 65
Citation Normalized Percentile: 0.83

Topics

Privacy-Preserving Technologies in Data
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Wireless Communication Technologies
Physical Sciences →  Engineering →  Electrical and Electronic Engineering
Internet Traffic Analysis and Secure E-voting
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

Communication-Efficient Federated Learning for Heterogeneous Clients

Ying Li, Xingwei Wang, Haodong Li, Praveen Kumar Donta, Min Huang, Schahram Dustdar

Journal: ACM Transactions on Internet Technology Year: 2025 Vol: 25 (2) Pages: 1-37
JOURNAL ARTICLE

FedHe: Heterogeneous Models and Communication-Efficient Federated Learning

Yun Hin Chan, Edith C.-H. Ngai

Journal: 2021 17th International Conference on Mobility, Sensing and Networking (MSN) Year: 2021 Pages: 207-214
JOURNAL ARTICLE

Federated Learning in Heterogeneous Networks With Unreliable Communication

Paul Zheng, Yao Zhu, Yulin Hu, Zhengming Zhang, Anke Schmeink

Journal: IEEE Transactions on Wireless Communications Year: 2023 Vol: 23 (4) Pages: 3823-3838
JOURNAL ARTICLE

Federated Learning in Heterogeneous Networks with Unreliable Communication

Paul Zheng, Yao Zhu, Zhengming Zhang, Yulin Hu, Anke Schmeink

Journal: 2021 IEEE Globecom Workshops (GC Wkshps) Year: 2021 Pages: 1-6