JOURNAL ARTICLE

UNIDEAL: Curriculum Knowledge Distillation Federated Learning

Abstract

Federated Learning (FL) has emerged as a promising approach to enable collaborative learning among multiple clients while preserving data privacy. However, cross-domain FL tasks, where clients possess data from different domains or distributions, remain challenging due to the inherent data heterogeneity. In this paper, we present UNIDEAL, a novel FL algorithm specifically designed to tackle the challenges of cross-domain scenarios and heterogeneous model architectures. The proposed method introduces Adjustable Teacher-Student Mutual Evaluation Curriculum Learning, which significantly enhances the effectiveness of knowledge distillation in FL settings. We conduct extensive experiments on various datasets, comparing UNIDEAL with state-of-the-art baselines. Our results demonstrate that UNIDEAL achieves superior performance in terms of both model accuracy and communication efficiency. Additionally, we provide a convergence analysis of the algorithm, showing a convergence rate of $O\left(\frac{1}{T}\right)$ under non-convex conditions.
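To make the distillation-in-FL idea concrete, the following is a minimal, hypothetical sketch of a curriculum-weighted knowledge distillation loss on a single client. The teacher/student roles, the temperature, and the linear curriculum ramp are illustrative assumptions only; the paper's actual Adjustable Teacher-Student Mutual Evaluation Curriculum Learning mechanism is not specified in this abstract.

```python
# Hypothetical sketch: curriculum-weighted knowledge distillation on one FL client.
# The linear ramp schedule and the temperature value are assumptions for illustration,
# not the schedule used by UNIDEAL itself.
import torch
import torch.nn.functional as F


def curriculum_weight(round_idx: int, total_rounds: int, max_weight: float = 1.0) -> float:
    """Linearly increase the distillation weight as federated rounds progress (assumed schedule)."""
    return max_weight * min(1.0, round_idx / max(1, total_rounds))


def distillation_loss(student_logits: torch.Tensor,
                      teacher_logits: torch.Tensor,
                      labels: torch.Tensor,
                      round_idx: int,
                      total_rounds: int,
                      temperature: float = 2.0) -> torch.Tensor:
    """Cross-entropy on local labels plus a curriculum-weighted KL term toward the teacher."""
    # Supervised term on the client's own labels.
    ce = F.cross_entropy(student_logits, labels)
    # Soft-label term: KL divergence between temperature-softened distributions.
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=1),
        F.softmax(teacher_logits / temperature, dim=1),
        reduction="batchmean",
    ) * (temperature ** 2)
    alpha = curriculum_weight(round_idx, total_rounds)
    return (1.0 - alpha) * ce + alpha * kl


if __name__ == "__main__":
    # Random tensors stand in for a client batch and a teacher's predictions.
    student_logits = torch.randn(8, 10, requires_grad=True)
    teacher_logits = torch.randn(8, 10)
    labels = torch.randint(0, 10, (8,))
    loss = distillation_loss(student_logits, teacher_logits, labels, round_idx=5, total_rounds=50)
    loss.backward()
    print(f"distillation loss: {loss.item():.4f}")
```

Under this assumed schedule the client relies mostly on its own labels early in training and shifts weight toward the teacher's soft targets in later rounds, which is one common way a curriculum is imposed on distillation.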

Keywords:
Federated learning, Knowledge distillation, Curriculum learning, Rate of convergence, Machine learning, Artificial intelligence, Data mining, Computer networks, Computer science

Metrics

Cited by: 6
FWCI (Field-Weighted Citation Impact): 3.83
References: 33
Citation Normalized Percentile: 0.90

Topics

Privacy-Preserving Technologies in Data
Physical Sciences → Computer Science → Artificial Intelligence