JOURNAL ARTICLE

Federated matched averaging with information-gain based parameter sampling

Abstract

Federated learning allows edge devices to learn a shared global model from clients' model parameters while keeping the training data on device. For large models, however, transmitting all model parameters imposes considerable communication costs, which can be a significant bottleneck in bandwidth-constrained deployments. We present a federated matched averaging algorithm with information-gain based sampling that substantially reduces the number of parameters each client must transmit. Experiments across five benchmark datasets, encompassing symbolic, image, and audio data, suggest that our algorithm significantly reduces communication overhead in federated learning settings without degrading classification accuracy.
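The parameter-sampling idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it omits the neuron-matching step of matched averaging, and it uses the magnitude of each local parameter update as a stand-in proxy for information gain. The function names (`client_update`, `server_aggregate`) and the `keep_frac` parameter are assumptions for illustration only.

```python
import numpy as np

def client_update(global_w, local_grad, lr=0.1, keep_frac=0.2):
    """Take one local SGD step, then return a sparse update containing
    only the top-k parameters ranked by an information-gain proxy
    (here: update magnitude, an assumption for this sketch)."""
    new_w = global_w - lr * local_grad
    delta = new_w - global_w
    k = max(1, int(keep_frac * delta.size))
    idx = np.argsort(np.abs(delta))[-k:]  # indices of the k largest changes
    return idx, delta[idx]

def server_aggregate(global_w, sparse_updates):
    """Average the received sparse deltas per coordinate and apply them
    to the global model (plain coordinate-wise averaging, not matching)."""
    sums = np.zeros_like(global_w)
    counts = np.zeros_like(global_w)
    for idx, vals in sparse_updates:
        sums[idx] += vals
        counts[idx] += 1
    mask = counts > 0  # only update coordinates some client reported
    global_w[mask] += sums[mask] / counts[mask]
    return global_w

# With keep_frac=0.2, each client sends 20% of the parameters,
# an 80% reduction in upstream communication per round.
```

With `keep_frac=0.2` each client uploads only a fifth of its parameter vector, which is the communication saving the paper targets; the coordinate-wise mean over reporting clients stands in for the full matched-averaging aggregation.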

Keywords:
Federated learning; Edge computing; Machine learning; Artificial intelligence; Sampling; Communication overhead; Bandwidth; Computer networks; Data mining

Metrics

Cited By: 7
FWCI (Field Weighted Citation Impact): 0.56
References: 15
Citation Normalized Percentile: 0.73

Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Mobile Crowdsensing and Crowdsourcing (Physical Sciences → Computer Science → Computer Science Applications)
Internet Traffic Analysis and Secure E-voting (Physical Sciences → Computer Science → Artificial Intelligence)