JOURNAL ARTICLE

AsyncFedKD: Asynchronous Federated Learning with Knowledge Distillation

Abstract

Federated learning (FL) enables decentralized training of a global model on edge devices without transferring raw data samples, thus preserving privacy. Owing to the ubiquity of wearable and mobile devices with health applications, FL has shown promise in the medical field for applications such as medical imaging, disease diagnosis, and electronic health record (EHR) analysis. However, slow edge devices with limited resources can stall the training process. To address this issue and increase efficiency, we propose Asynchronous Federated Learning with Knowledge Distillation (AsyncFedKD). AsyncFedKD asynchronously trains a lightweight global student model using a pre-trained teacher model, so that slow edge devices do not degrade training efficiency. The knowledge distillation component of AsyncFedKD also compresses the model parameters for efficient communication during training. Evaluated on a sensitive mammography cancer dataset, the global model achieved an accuracy of 88%.
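The abstract names two ingredients without implementation detail: a teacher-to-student distillation loss and asynchronous merging of client updates into the global student. A minimal sketch of both, assuming a Hinton-style temperature-softened distillation loss and a FedAsync-style staleness-weighted merge; the function names, the mixing rate `alpha`, and the `1/(1+staleness)` decay are illustrative assumptions, not the paper's exact rules:

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-softened softmax; higher T flattens the distribution.
    z = np.asarray(logits, dtype=float) / T
    e = np.exp(z - z.max())
    return e / e.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    # KL(teacher || student) on softened outputs, scaled by T^2 as in
    # Hinton-style knowledge distillation. Zero when the logits match.
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    return float(np.sum(p * (np.log(p) - np.log(q)))) * T * T

def async_update(global_weights, client_weights, staleness, alpha=0.6):
    # Mix a (possibly stale) client update into the global student model,
    # downweighting updates that arrive many rounds late.
    a = alpha / (1.0 + staleness)
    return [(1 - a) * g + a * c for g, c in zip(global_weights, client_weights)]

# Toy example: one global layer, one client update arriving 2 rounds late.
g = [np.zeros(3)]
c = [np.ones(3)]
new_g = async_update(g, c, staleness=2)  # mixed with weight 0.6 / 3 = 0.2
```

Because the merge happens per arriving update rather than at a synchronization barrier, a slow client delays only its own contribution, not the global round.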

Keywords:
Federated learning, Knowledge distillation, Asynchronous communication, Edge computing, Edge devices, Wearable technology, Machine learning, Artificial intelligence, Embedded systems, Cloud computing

Metrics

Cited By: 1
Refs: 2
FWCI (Field-Weighted Citation Impact): 0.26
Citation Normalized Percentile: 0.60

Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
AI in cancer detection (Physical Sciences → Computer Science → Artificial Intelligence)