JOURNAL ARTICLE

Federated Learning with Matched Averaging

Abstract

Federated learning allows edge devices to collaboratively learn a shared model while keeping the training data on device, decoupling the ability to do model training from the need to store the data in the cloud. We propose the Federated Matched Averaging (FedMA) algorithm, designed for federated learning of modern neural network architectures such as convolutional neural networks (CNNs) and LSTMs. FedMA constructs the shared global model in a layer-wise manner by matching and averaging hidden elements (i.e. channels for convolution layers, hidden states for LSTMs, and neurons for fully connected layers) with similar feature extraction signatures. Our experiments indicate that FedMA not only outperforms popular state-of-the-art federated learning algorithms on deep CNN and LSTM architectures trained on real-world datasets, but also reduces the overall communication burden.
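The core idea of matching and averaging hidden elements with similar feature extraction signatures can be sketched for a single fully connected layer. The sketch below is illustrative only: it uses Hungarian assignment (`scipy.optimize.linear_sum_assignment`) on squared Euclidean distances between neurons' incoming weight vectors as a simple stand-in for FedMA's full matching objective, and the function name and two-client setup are assumptions, not the paper's implementation.

```python
# Minimal sketch of one matched-averaging step for a single fully connected
# layer across two clients. Hungarian matching on weight-vector distances is
# used here as a simplified proxy for FedMA's matching objective.
import numpy as np
from scipy.optimize import linear_sum_assignment

def matched_average(layer_a, layer_b):
    """Align neurons of layer_b to layer_a by weight similarity, then average.

    layer_a, layer_b: (n_neurons, n_inputs) weight matrices from two clients.
    """
    # Cost of pairing neuron i of client A with neuron j of client B:
    # squared Euclidean distance between their incoming weight vectors.
    cost = ((layer_a[:, None, :] - layer_b[None, :, :]) ** 2).sum(axis=2)
    rows, cols = linear_sum_assignment(cost)  # optimal one-to-one matching
    permuted_b = layer_b[cols]                # reorder B's neurons to match A
    return 0.5 * (layer_a + permuted_b)       # average the matched pairs

# Usage: two clients whose neurons are permutations of each other, the
# situation matched averaging is designed to handle (naive coordinate-wise
# averaging would blur these weights together).
rng = np.random.default_rng(0)
w = rng.normal(size=(4, 3))
perm = rng.permutation(4)
merged = matched_average(w, w[perm])
print(np.allclose(merged, w))  # True: matching undoes the permutation
```

The usage example highlights why matching matters: because clients can learn the same features in different neuron orders, averaging without alignment mixes unrelated neurons, while averaging after matching recovers the shared weights.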

Keywords:
Computer science; Convolutional neural network; Federated learning; Artificial intelligence; Convolution (computer science); Deep learning; Decoupling (probability); Layer (electronics); Edge device; Feature extraction; Machine learning; Cloud computing; Feature (linguistics); Artificial neural network; Pattern recognition (psychology)

Metrics

Cited By: 102
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 18

Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Traffic Prediction and Management Techniques (Physical Sciences → Engineering → Building and Construction)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)