CONFERENCE PAPER

Neuron Specific Pruning for Communication Efficient Federated Learning

G. K. Sudhina Kumar, Durga Toshniwal

Year: 2022   Published in: Proceedings of the 31st ACM International Conference on Information & Knowledge Management   Pages: 4148-4152

Abstract

Federated Learning (FL) is a distributed training framework in which a model is collaboratively trained over a set of clients without communicating their private data to a central server. However, each client shares the parameters of its local model, and a key challenge in FL is therefore the high communication cost incurred by the size of Deep Neural Network (DNN) models. Pruning is an efficient technique for reducing the number of parameters in DNN models by removing insignificant neurons. This paper introduces a federated pruning method based on the Neuron Importance Score Propagation (NISP) algorithm, in which the importance scores of output-layer neurons are back-propagated layer-wise to every neuron in the network. The central server iteratively broadcasts the sparsified weights to all selected clients. Each participating client intermittently downloads the mask vector and reconstructs the weights in their original form. The locally updated model is then pruned using the mask vector and shared with the server. After receiving model updates from each participating client, the server reconstructs and aggregates the weights. Experiments on the MNIST and CIFAR10 datasets demonstrate that the proposed approach achieves accuracy close to that of the Federated Averaging (FedAvg) algorithm at lower communication cost.
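The round described in the abstract — prune with a mask, communicate only the surviving weights, reconstruct on the other side, then aggregate — can be sketched in a few lines of NumPy. This is an illustrative sketch only, not the authors' implementation: a single weight vector stands in for a full DNN, weight magnitude stands in for the NISP importance scores, and additive noise stands in for local training.

```python
import numpy as np

def make_mask(importance, keep_ratio):
    """Keep the top-`keep_ratio` fraction of neurons by importance score."""
    k = max(1, int(len(importance) * keep_ratio))
    mask = np.zeros(importance.shape, dtype=bool)
    mask[np.argsort(importance)[-k:]] = True
    return mask

def sparsify(weights, mask):
    """Drop pruned entries; only the surviving values are communicated."""
    return weights[mask]

def reconstruct(sparse, mask):
    """Re-expand communicated values to the original dense shape."""
    dense = np.zeros(mask.shape)
    dense[mask] = sparse
    return dense

# --- one illustrative FL round (2 clients, a single weight vector) ---
rng = np.random.default_rng(0)
global_w = rng.normal(size=10)
importance = np.abs(global_w)            # stand-in for NISP scores
mask = make_mask(importance, keep_ratio=0.5)

# server broadcasts the sparsified weights together with the mask
payload = sparsify(global_w, mask)

updates = []
for _ in range(2):
    w = reconstruct(payload, mask)               # client rebuilds dense weights
    w += 0.01 * rng.normal(size=w.shape)         # stand-in for local training
    updates.append(sparsify(w, mask))            # prune again before upload

# server reconstructs each update and averages (FedAvg over kept entries)
new_global = np.mean([reconstruct(u, mask) for u in updates], axis=0)
```

With a keep ratio of 0.5, each upload and download carries half the parameters, which is the communication saving the paper targets; pruned positions remain zero after aggregation.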

Keywords:
Pruning, Computer science, MNIST database, Scope (computer science), Layer (electronics), Set (abstract data type), Artificial neural network, Artificial intelligence, Server, Machine learning, Computer network

Metrics

Cited By: 7
FWCI (Field Weighted Citation Impact): 0.82
Refs: 8
Citation Normalized Percentile: 0.72


Topics

Privacy-Preserving Technologies in Data
Physical Sciences →  Computer Science →  Artificial Intelligence
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Adversarial Robustness in Machine Learning
Physical Sciences →  Computer Science →  Artificial Intelligence

Related Documents

JOURNAL ARTICLE

FedNISP: Neuron Importance Scope Propagation pruning for communication efficient federated learning

Gaurav Kumar, Durga Toshniwal

Journal:   Computers & Electrical Engineering   Year: 2024   Vol: 118   Article: 109349
BOOK-CHAPTER

Communication-Efficient Federated Learning with Model Pruning

Min-Kuan Chang, Yu-Wei Chan, Ting-En Wu

Lecture Notes in Electrical Engineering   Year: 2023   Pages: 67-76
BOOK-CHAPTER

Communication Efficient Reinforcement Learning-Based Federated Pruning

Weishan Zhang, Jiakai Wang, Yuming Nie, Hongwei Zhao, Yuru Liu, Haoyun Sun, Tao Chen, Baoyu Zhang

Lecture Notes of the Institute for Computer Sciences, Social Informatics and Telecommunications Engineering   Year: 2025   Pages: 3-17
JOURNAL ARTICLE

Communication-efficient federated learning via personalized filter pruning

Qi Min, Fei Luo, Wenbo Dong, Chunhua Gu, Weichao Ding

Journal:   Information Sciences   Year: 2024   Vol: 678   Article: 121030