Aijun Wen, Yunxi Fu, Zesan Liu, Zhenya Wang, Wenjuan Zhang
Federated learning enables clients to jointly train a model without sharing their original data, thereby preserving privacy. Compared with centralized training, however, federated learning must cope with heterogeneous data across distributed participants and with communication bottlenecks. This article proposes a hierarchical Bayesian federated learning approach that achieves local model personalization and hierarchical aggregation of model parameters, addressing the data heterogeneity problem and reducing communication costs. Variational inference effectively handles the heterogeneous data held by each participant and is robust across different types of statistical heterogeneity, thereby enabling personalization of the local models. Multilevel hierarchical parameter aggregation and resource scheduling further reduce communication costs. In the proposed framework, global variables govern the random variables of each participant's local model, and the model is constructed hierarchically and collaboratively, improving robustness and optimizing communication.
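To make the communication-cost argument concrete, the following is a minimal, hypothetical sketch (not the authors' implementation) of two-level hierarchical parameter aggregation: clients send updates only to their edge aggregator, which forwards a single mean to the cloud; the cloud combines the edge-level means weighted by the number of clients each edge serves.

```python
# Hypothetical sketch of two-level aggregation in hierarchical federated
# learning. All names (average, weighted_average, edge_groups) are
# illustrative assumptions, not from the article.

def average(vectors):
    """Element-wise mean of a list of equal-length parameter vectors."""
    n = len(vectors)
    return [sum(v[i] for v in vectors) / n for i in range(len(vectors[0]))]

def weighted_average(vectors, weights):
    """Element-wise weighted mean (weights = clients per edge node)."""
    total = sum(weights)
    dim = len(vectors[0])
    return [sum(w * v[i] for v, w in zip(vectors, weights)) / total
            for i in range(dim)]

# Three edge nodes serving 2, 1, and 1 clients respectively; each client
# holds a 2-dimensional parameter vector after local training.
edge_groups = [
    [[1.0, 2.0], [3.0, 4.0]],   # clients at edge node 0
    [[5.0, 6.0]],               # client at edge node 1
    [[7.0, 8.0]],               # client at edge node 2
]
edge_means = [average(g) for g in edge_groups]
global_params = weighted_average(edge_means, [len(g) for g in edge_groups])

# The hierarchical result matches a flat average over all four clients,
# but only one vector per edge node crosses the edge-to-cloud link.
flat = average([c for g in edge_groups for c in g])
assert global_params == flat == [4.0, 5.0]
```

The weighting by client count is what makes the two-level average coincide with the flat one; the savings come from replacing per-client uplinks to the cloud with one aggregate per edge node.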
Qimei Chen, Zehua You, Jing Wu, Yunpeng Liu, Hao Jiang
Chaoqun You, Kun Guo, Howard H. Yang, Tony Q. S. Quek
Guozeng Xu, Xiuhua Li, Hui Li, Qilin Fan, Xiaofei Wang, Victor C. M. Leung