JOURNAL ARTICLE

Hierarchical Asynchronous Federated Learning Algorithm for Edge Computing Networks

Aijun Wen, Yunxi Fu, Zesan Liu, Zhenya Wang, Wenjuan Zhang

Year: 2025   Journal: Journal of Internet Technology (網際網路技術學刊)   Pages: 617-629   Publisher: Taiwan Academic Network

Abstract

Federated learning aims to enable clients to jointly train a model while preserving privacy, without sharing their original data. Compared to centralized model training, federated learning introduces the problems of data heterogeneity among distributed participants and of communication bottlenecks. This article proposes a hierarchical Bayesian federated learning approach that achieves local model personalization and hierarchical model parameter aggregation, thereby addressing the heterogeneous data problem and reducing communication costs in federated learning. The variational inference method can effectively handle the heterogeneous data held by each participant, demonstrating strong robustness across different types of statistical heterogeneity and thereby realizing the personalization of local models. Multilevel hierarchical model parameter aggregation and resource scheduling further reduce communication costs in federated learning. The proposed hierarchical Bayesian federated learning framework therefore controls the random variables of each participant's local model with global variables, and the model is constructed hierarchically and collaboratively, improving robustness and optimizing communication.
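To make the multilevel aggregation idea concrete, the following is a minimal sketch (not the authors' implementation) of two-level parameter aggregation in an edge computing network: each edge server averages the parameter updates of its own clients, and the cloud then averages the edge-level models weighted by client counts, so clients never communicate with the cloud directly. All names used here (aggregate, client_updates_by_edge, client_counts) are illustrative assumptions.

# Minimal sketch, assuming a two-level client -> edge -> cloud topology.
# Not the authors' algorithm; hierarchical weighted averaging only.
import numpy as np

def aggregate(updates, weights=None):
    """Weighted average of a list of parameter vectors."""
    updates = np.stack(updates)                      # shape (n, d)
    if weights is None:
        weights = np.ones(len(updates))
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()                # normalize to sum to 1
    return np.tensordot(weights, updates, axes=1)    # shape (d,)

# Hypothetical example: 3 edge servers, each with a few clients holding
# locally trained parameter vectors of dimension 4.
rng = np.random.default_rng(0)
client_updates_by_edge = [
    [rng.normal(size=4) for _ in range(5)],   # edge 0
    [rng.normal(size=4) for _ in range(3)],   # edge 1
    [rng.normal(size=4) for _ in range(4)],   # edge 2
]

# Level 1: each edge server aggregates only its own clients' updates.
edge_models = [aggregate(clients) for clients in client_updates_by_edge]

# Level 2: the cloud aggregates the edge models, weighted by client counts.
client_counts = [len(c) for c in client_updates_by_edge]
global_model = aggregate(edge_models, weights=client_counts)
print(global_model)

Under these assumptions, each round sends one edge-to-cloud message per edge server instead of one per client, which illustrates the kind of communication saving the abstract attributes to multilevel hierarchical aggregation.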


Topics

Privacy-Preserving Technologies in Data
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Graph Neural Networks
Physical Sciences →  Computer Science →  Artificial Intelligence
Caching and Content Delivery
Physical Sciences →  Computer Science →  Computer Networks and Communications