Fangtong Zhou, Zhibin Wang, Xiliang Luo, Yong Zhou
Communication bottlenecks and statistical heterogeneity are two critical challenges of federated learning (FL) over wireless networks. To tackle both challenges, in this paper we propose an over-the-air computation (AirComp) assisted hierarchical personalized FL (HPFL) framework, in which a device-edge-cloud three-tier network architecture is adopted to simultaneously learn a global model and multiple personalized local models. We analyze the convergence of the AirComp-assisted HPFL framework and formulate an optimization problem to minimize the transmission distortion, which is an essential component of the convergence upper bound. An efficient algorithm is subsequently developed to optimize the transceiver design by leveraging successive convex approximation and Lagrangian duality. Extensive simulations demonstrate that the developed algorithm achieves near-optimal performance and substantially higher test accuracy than the baseline algorithms.
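The AirComp aggregation idea underlying the abstract can be illustrated with a minimal numerical sketch: devices transmit analog-scaled model updates simultaneously, the fading channel superimposes them, and the server rescales the noisy sum to estimate the average update. All symbols below (channel gains `h`, transmit scalars `b`, receive factor `eta`, the noise level) are illustrative assumptions, not the paper's actual transceiver design.

```python
import numpy as np

# Hypothetical sketch of over-the-air computation (AirComp) aggregation.
# K devices send model updates x_k at once; the receiver observes their
# channel-weighted superposition plus noise and rescales to recover the mean.

rng = np.random.default_rng(0)
K, d = 8, 16                                  # devices, model dimension
updates = rng.normal(size=(K, d))             # local model updates x_k
h = rng.rayleigh(scale=1.0, size=K) + 0.1     # channel gains, bounded away from 0

eta = h.min() ** 2                            # receive (denoising) scaling factor
b = np.sqrt(eta) / h                          # transmit scalars inverting each channel

noise = rng.normal(scale=0.05, size=d)        # additive receiver noise
y = (h[:, None] * b[:, None] * updates).sum(axis=0) + noise
aggregated = y / (K * np.sqrt(eta))           # estimate of the average update

ideal = updates.mean(axis=0)
distortion = np.mean((aggregated - ideal) ** 2)   # MSE vs. noiseless average
```

Because `h_k * b_k = sqrt(eta)` for every device, the received signal is a uniformly scaled sum of the updates, so the only distortion comes from the rescaled noise term; the paper's transceiver optimization targets exactly this distortion under power constraints.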