JOURNAL ARTICLE

An Unsupervised Federated Domain Adaptation Method Based on Knowledge Distillation

Yunpeng Xiao, Ying Guo, Haipeng Zhu, Chaolong Jia, Qian Li, Rong Wang, Guoyin Wang

Year: 2024 | Journal: IEEE Transactions on Neural Networks and Learning Systems | Vol: 36 (6) | Pages: 10993-11007 | Publisher: Institute of Electrical and Electronics Engineers

Abstract

Conventional unsupervised multisource domain adaptation (UMDA) methods assume that all source-domain data are directly accessible. To address the problem that existing UMDA methods cannot directly obtain source-domain data in federated learning (FL), a knowledge distillation-based multisource domain adaptation method suited to FL is proposed. First, because knowledge distillation allows learning solely through model access, this article adopts an improved voting mechanism that applies a smoothing technique to the confidence distributions of the source-domain models. This reduces the influence of models with extremely high confidence, thereby extracting high-quality consensus knowledge. Second, this article designs an adaptive weighting strategy for the teacher models: it identifies irrelevant and malicious domains according to the similarity between the consensus knowledge and each teacher model's output, which improves the model's robustness against negative transfer. Finally, this article introduces the idea of contrastive learning to control the drift of individual source domains and to bridge the deviation between the representations learned by the local model and the global model. Experiments show that the proposed method outperforms mainstream UMDA methods and is robust to negative transfer, making it suitable for many practical FL applications.
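The two distillation ingredients summarized in the abstract — smoothing each teacher's confidence distribution before voting, then down-weighting teachers whose outputs diverge from the consensus — can be sketched as below. This is a minimal illustration under stated assumptions, not the paper's exact formulation: the temperature value, the use of cosine similarity, and all function names are illustrative choices.

```python
import numpy as np

def softmax(logits, T=1.0):
    # Temperature-smoothed softmax; T > 1 flattens over-confident
    # distributions so no single teacher dominates the vote.
    z = logits / T
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def consensus_knowledge(teacher_logits, T=2.0):
    """Smooth each source-domain teacher's predictions, then average
    them into a consensus distribution over target samples."""
    probs = np.stack([softmax(l, T) for l in teacher_logits])  # (K, N, C)
    return probs, probs.mean(axis=0)                           # (N, C)

def teacher_weights(probs, consensus, eps=1e-8):
    """Weight each teacher by the mean cosine similarity between its
    smoothed output and the consensus; teachers from irrelevant or
    malicious domains disagree with the consensus and get small weight."""
    sims = []
    for p in probs:
        num = (p * consensus).sum(axis=-1)
        den = (np.linalg.norm(p, axis=-1)
               * np.linalg.norm(consensus, axis=-1) + eps)
        sims.append((num / den).mean())
    w = np.clip(np.array(sims), 0.0, None)
    return w / (w.sum() + eps)
```

As a usage sketch: given three teachers where the third produces adversarially inverted logits, `teacher_weights` assigns the third a noticeably smaller weight than the two agreeing teachers, which is the behavior the adaptive weighting strategy relies on to resist negative transfer.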

Keywords:
Domain adaptation, Knowledge distillation, Federated learning, Artificial intelligence, Computer science

Metrics

Cited By: 4
FWCI (Field-Weighted Citation Impact): 2.56
Refs: 50
Citation Normalized Percentile: 0.87

Topics

Domain Adaptation and Few-Shot Learning
Physical Sciences →  Computer Science →  Artificial Intelligence