Paul Zheng, Yao Zhu, Yulin Hu, Zhengming Zhang, Anke Schmeink
In federated learning (FL), local workers collaboratively learn a global model from their local data, communicating only trained models to a central server in order to preserve privacy. Due to its distributed nature, FL is typically subject to various forms of heterogeneity, including system and statistical heterogeneity. To address these concerns, Federated Proximal (FedProx) has been considered a promising FL paradigm that provides more stable convergence in the presence of computation stragglers and statistical heterogeneity. However, in wireless networks with unreliable communication channels, packet transmission errors must also be taken into account, introducing an additional source of heterogeneity. For the first time, we rigorously prove the convergence of FedProx in the presence of packet transmission errors in heterogeneous networks. In addition, we propose a joint client selection and resource allocation strategy that maximizes the number of effectively participating users to accelerate convergence. The method is combined with a random weighting mechanism to reduce the statistical bias introduced by the client selection strategy. An efficient low-complexity algorithm for solving the resulting optimization problem is developed. The proposed method achieves faster convergence and requires fewer communication rounds to attain a target accuracy than existing state-of-the-art client selection methods.
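To make the FedProx paradigm referenced above concrete, here is a minimal sketch of a single client's local update. The function name, the toy quadratic objective, and all hyperparameter values are illustrative assumptions, not the paper's implementation; the only element taken from FedProx itself is the proximal term `(mu/2)||w - w_global||^2` added to the local loss, which keeps each client's model close to the current global model under statistical heterogeneity.

```python
import numpy as np

def fedprox_local_update(w_global, grad_fn, mu=1.0, lr=0.05, steps=20):
    """Approximately solve min_w F_k(w) + (mu/2)||w - w_global||^2.

    grad_fn returns the gradient of the client's local loss F_k.
    The proximal term mu*(w - w_global) pulls the local iterate back
    toward the global model, stabilizing training on non-IID data.
    """
    w = w_global.copy()
    for _ in range(steps):
        g = grad_fn(w) + mu * (w - w_global)  # gradient of regularized objective
        w -= lr * g
    return w

# Illustrative local objective: F_k(w) = 0.5*||w - c||^2 with minimizer c.
c = np.array([2.0, -1.0])
grad_fn = lambda w: w - c

w_global = np.zeros(2)
w_local = fedprox_local_update(w_global, grad_fn)
```

With `mu > 0` the local solution lands between the global model and the client's own minimizer `c` rather than overshooting to `c`, which is exactly the stabilizing effect that motivates FedProx over plain FedAvg local training.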
Paul Zheng, Yao Zhu, Zhengming Zhang, Yulin Hu, Anke Schmeink
Zheshun Wu, Zenglin Xu, Dun Zeng, Junfan Li, Jie Liu
Xiongtao Zhang, Xiaomin Zhu, Ji Wang, Hui Yan, Huangke Chen, Weidong Bao
Jing Cao, Ran Wei, Qianyue Cao, Yongchun Zheng, Zongwei Zhu, Cheng Ji, Xuehai Zhou