JOURNAL ARTICLE

FedDyn: A dynamic and efficient federated distillation approach on Recommender System

Abstract

Federated Learning (FL) is a popular distributed machine learning paradigm that enables devices to collaboratively train a centralized model without transmitting raw data. However, when the model becomes complex, the communication overhead on mobile devices can be unacceptably large in traditional FL methods. To address this problem, Federated Distillation (FD) has been proposed as a federated version of knowledge distillation. Most recent FD methods compute each client's model outputs (logits) on a public proxy dataset as local knowledge and distill from the average of the clients' logits on the server side. However, these FD methods are not robust and perform poorly in non-IID (non-independent and identically distributed data) scenarios such as Federated Recommendation (FR). To mitigate the non-IID problem and apply FD to FR, we propose a novel method named FedDyn that constructs a proxy dataset and extracts local knowledge dynamically. FedDyn replaces the averaging strategy with focus distillation, which strengthens reliable knowledge and thereby addresses the non-IID problem of locally biased knowledge; averaging dilutes and perturbs knowledge because it treats reliable and unreliable knowledge as equally important. In addition, to prevent inference of private user information from local knowledge, we protect this knowledge on the client side with a technique akin to local differential privacy. Experimental results show that our method achieves faster convergence and lower communication overhead than the baselines on three datasets: MovieLens-100K, MovieLens-1M, and Pinterest.
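The contrast between plain logit averaging and a reliability-weighted aggregation can be sketched as follows. This is a minimal illustration only: the abstract does not specify how focus distillation weights knowledge, so the confidence-based weighting below (each client's peak softmax probability) is an assumed stand-in, and the function names are hypothetical.

```python
import numpy as np

def average_logits(client_logits):
    # Standard FD aggregation: every client's knowledge is treated
    # as equally reliable, which dilutes confident (reliable) logits.
    return np.mean(np.asarray(client_logits), axis=0)

def focus_distillation(client_logits):
    # Assumed sketch of a reliability-weighted aggregation: weight each
    # client's logits by its softmax confidence so that reliable
    # knowledge dominates. (The paper's actual weighting scheme is not
    # given in the abstract.)
    logits = np.asarray(client_logits)                 # (n_clients, n_items)
    exp = np.exp(logits - logits.max(axis=1, keepdims=True))
    probs = exp / exp.sum(axis=1, keepdims=True)
    confidence = probs.max(axis=1, keepdims=True)      # peak probability per client
    weights = confidence / confidence.sum()            # normalize to sum to 1
    return (weights * logits).sum(axis=0)

client_logits = [
    [2.0, 0.1, 0.1],   # confident client: sharply peaked logits
    [0.5, 0.4, 0.6],   # uncertain client: nearly flat logits
]
avg = average_logits(client_logits)
focused = focus_distillation(client_logits)
```

With this weighting, the confident client's preference for the first item is preserved more strongly in `focused` than in `avg`, illustrating why averaging acts as a dilution of reliable knowledge.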

Keywords:
Computer science; Overhead (engineering); Distillation; Independent and identically distributed random variables; Recommender system; Inference; Data mining; MovieLens; Machine learning; Artificial intelligence; Collaborative filtering

Metrics

Cited By: 25
FWCI (Field Weighted Citation Impact): 6.39
Refs: 50
Citation Normalized Percentile: 0.96 (top 1% / top 10% indicators)

Topics

Privacy-Preserving Technologies in Data
Physical Sciences →  Computer Science →  Artificial Intelligence
Recommender Systems and Techniques
Physical Sciences →  Computer Science →  Information Systems
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence