JOURNAL ARTICLE

Decentralized Distributed Federated Learning Based on Multi-Key Homomorphic Encryption

Abstract

Federated learning is a distributed machine-learning paradigm that has been widely studied and applied in a variety of scenarios. Because conventional federated learning relies on a single central server to receive model updates from all clients, it imposes extremely high network-bandwidth requirements and carries the risks of a single point of failure and privacy leakage. To prevent data leakage, this paper proposes a local data aggregation scheme based on xMK-CKKS. To provide decentralized service, it proposes a global model aggregation scheme based on RingAllreduce. Building on these, a decentralized distributed federated learning scheme based on multi-key homomorphic encryption is proposed, realizing decentralized hierarchical federated learning with privacy protection. The security and performance analyses show that the proposed scheme scales to larger federated learning scenarios while ensuring data security, and is robust against collusion of up to $k \lt N-1$ clients and distributed servers.
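The global aggregation step the abstract names is RingAllreduce, in which each of $N$ nodes exchanges model chunks with its ring neighbors so that every node ends up with the full sum without any central server. As a rough single-process illustration only (this is not the paper's implementation, omits the xMK-CKKS encryption layer, and all names such as `ring_allreduce` and `models` are invented for the sketch), the pattern looks like this:

```python
# Single-process simulation of RingAllreduce over plain (unencrypted)
# parameter vectors. Illustrative sketch only; the paper combines this
# pattern with xMK-CKKS ciphertext aggregation.

def ring_allreduce(models):
    """Sum-reduce equal-length vectors across a logical ring of nodes.

    models: list of N lists of floats, one per node; the vector length
    is assumed divisible by N for simplicity. Returns one buffer per
    node; after the two phases, every node holds the full element-wise
    sum, having only ever exchanged data with its ring neighbors.
    """
    n = len(models)
    chunk = len(models[0]) // n
    bufs = [list(m) for m in models]  # work on copies

    # Phase 1: scatter-reduce. In step s, node i sends chunk (i - s)
    # to node i + 1 and accumulates the chunk it receives from i - 1.
    for s in range(n - 1):
        sends = []
        for i in range(n):
            c = (i - s) % n
            sends.append((c, bufs[i][c * chunk:(c + 1) * chunk]))
        for i in range(n):
            c, data = sends[(i - 1) % n]  # received from left neighbor
            for j in range(chunk):
                bufs[i][c * chunk + j] += data[j]

    # Phase 2: allgather. Each node's fully reduced chunk circulates
    # around the ring until every node holds the complete summed vector.
    for s in range(n - 1):
        sends = []
        for i in range(n):
            c = (i + 1 - s) % n
            sends.append((c, bufs[i][c * chunk:(c + 1) * chunk]))
        for i in range(n):
            c, data = sends[(i - 1) % n]
            bufs[i][c * chunk:(c + 1) * chunk] = data
    return bufs
```

Each phase takes $N-1$ steps, and every node transfers roughly $2(N-1)/N$ of the model size in total, which is why the pattern removes the central server's bandwidth bottleneck that the abstract criticizes.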

Keywords:
Computer science; Homomorphic encryption; Scalability; Server; Collusion; Single point of failure; Distributed computing; Federated learning; Encryption; Information privacy; Scheme (mathematics); Distributed learning; Computer network; Computer security; Database

Metrics

Cited by: 3
FWCI (Field-Weighted Citation Impact): 0.77
References: 22
Citation Normalized Percentile: 0.74

Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Cryptography and Data Security (Physical Sciences → Computer Science → Artificial Intelligence)
Internet Traffic Analysis and Secure E-voting (Physical Sciences → Computer Science → Artificial Intelligence)