Yuchang Sun, Jiawei Shao, Yuyi Mao, Jun Zhang
Federated edge learning (FEEL) has drawn much attention as a privacy-preserving distributed learning framework for mobile edge networks. In this work, we investigate a novel semi-decentralized FEEL (SD-FEEL) architecture in which multiple edge servers collaborate to incorporate data from more edge devices in training. Although fast edge aggregation enables low training latency, heterogeneity in the devices' computational resources degrades training efficiency. This paper proposes an asynchronous training algorithm for SD-FEEL that overcomes this issue by allowing each edge server to independently set deadlines for its associated client nodes and trigger model aggregation. To handle the resulting differences in model staleness, we design a staleness-aware aggregation scheme and analyze its convergence. Simulation results demonstrate that the proposed algorithm achieves faster convergence and better learning performance than synchronous training.
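The abstract does not specify the exact aggregation rule, but staleness-aware schemes typically down-weight updates computed on older versions of the model. The following is a minimal, hypothetical sketch of such a scheme (the decay exponent `alpha` and the polynomial decay form are illustrative assumptions, not the authors' method), with model parameters reduced to a toy dictionary of scalars:

```python
# Hypothetical staleness-aware aggregation sketch (assumed polynomial decay,
# not the authors' exact scheme): each client update is weighted by a factor
# that decays with its staleness tau, i.e. how many aggregation rounds old
# the model version the update was computed from is.

def staleness_weight(tau, alpha=0.5):
    """Weight decays polynomially with staleness: (tau + 1) ** (-alpha)."""
    return (tau + 1) ** (-alpha)

def aggregate(global_model, client_updates, alpha=0.5):
    """Weighted average of client models.

    global_model: dict mapping parameter name -> float (toy scalar params).
    client_updates: list of (model_dict, staleness) pairs, where staleness
    is a non-negative integer.
    """
    weights = [staleness_weight(tau, alpha) for _, tau in client_updates]
    total = sum(weights)
    return {
        name: sum(w * model[name] for (model, _), w in zip(client_updates, weights)) / total
        for name in global_model
    }

# Toy usage: a fresh update (staleness 0) pulls the average toward itself
# more strongly than a stale one (staleness 2).
g = {"w": 0.0}
updates = [({"w": 1.0}, 0), ({"w": 0.5}, 2)]
print(aggregate(g, updates))
```

A fresh update gets weight 1.0, while a staleness-2 update gets roughly 0.58, so the aggregated parameter lands closer to the fresh client's value, which is the intended bias of staleness-aware averaging.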