Mengxue Shang, Dandan Zhang, Fengyin Li
Federated learning is a distributed machine learning paradigm that has been widely studied and applied in a variety of scenarios. Because federated learning relies on a single central server to receive model updates from all clients, it imposes extremely high network-bandwidth requirements and carries the risks of a single point of failure and privacy leakage. To prevent data leakage, this paper proposes a local data aggregation scheme based on xMK-CKKS. To provide decentralized service, it proposes a global model aggregation scheme based on RingAllreduce. Building on these, a decentralized distributed federated learning scheme based on multi-key homomorphic encryption is proposed, realizing privacy-preserving hierarchical federated learning without a central server. The security and performance analyses show that the proposed scheme scales to larger federated learning scenarios while ensuring data security, and remains robust against collusion among up to $k < N-1$ clients and distributed servers.
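The abstract's global model aggregation builds on RingAllreduce, in which N nodes arranged in a ring exchange chunks of their model vectors so that, after a scatter-reduce phase and an all-gather phase (2(N-1) steps in total), every node holds the elementwise sum with no central server. The sketch below is a minimal single-process simulation of that schedule, not the paper's implementation; the function name and plain-list representation are illustrative assumptions.

```python
def ring_allreduce(vectors):
    """Simulate ring all-reduce: every node ends with the elementwise sum.

    vectors: list of N equal-length lists, one per node (illustrative stand-in
    for each client's model-update vector).
    Returns the N per-node result buffers, all equal to the sum.
    """
    n = len(vectors)
    dim = len(vectors[0])
    # Split each vector into n roughly equal chunks.
    bounds = [round(i * dim / n) for i in range(n + 1)]
    data = [v[:] for v in vectors]  # each node's working buffer

    # Scatter-reduce: after n-1 steps, node i holds the complete sum
    # of chunk (i+1) % n.
    for step in range(n - 1):
        for i in range(n):
            c = (i - step) % n          # chunk node i forwards this step
            lo, hi = bounds[c], bounds[c + 1]
            dst = (i + 1) % n
            for j in range(lo, hi):
                data[dst][j] += data[i][j]

    # All-gather: each node forwards its completed chunk around the ring,
    # and receivers overwrite their stale copy.
    for step in range(n - 1):
        for i in range(n):
            c = (i + 1 - step) % n      # completed chunk node i forwards
            lo, hi = bounds[c], bounds[c + 1]
            dst = (i + 1) % n
            data[dst][lo:hi] = data[i][lo:hi]

    return data


# Example: three nodes, three-element vectors; every node ends with the sum.
result = ring_allreduce([[1, 2, 3], [4, 5, 6], [7, 8, 9]])
# Every buffer is now [12, 15, 18]; dividing by N would give a FedAvg-style mean.
```

Each step moves only one chunk per link, so per-node traffic stays roughly constant as N grows, which is the scalability property the abstract appeals to; in the paper's setting the exchanged values would additionally be xMK-CKKS ciphertexts rather than plaintext sums.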
Yue Xiao, Ye Yu, Xiyu Sheng, Yang You, Sotiris A. Tegos, Guoqiang Xiao, George K. Karagiannidis, Carlo Fischione
Hemant Ramdas Kumbhar, S. Srinivasa Rao
Qian Zhang, Shan Jing, Chuan Zhao, Bo Zhang, Zhenxiang Chen
Jing Ma, Si-Ahmed Naas, Stephan Sigg, Xixiang Lyu
Wenxiu Ding, Hao Guo, Zheng Yan, Mingjun Wang