JOURNAL ARTICLE

Fully Decentralized Client Selection for Energy-Efficient Federated Learning

Abstract

<p>Federated learning is a widely adopted paradigm for training machine learning models over distributed datasets without sharing local data with a central server. However, in many real-world settings, not all clients are equally beneficial to training. Selecting the optimal subset of participating clients is therefore critical for achieving high performance at low energy consumption. This paper formulates client selection as a multi-agent optimization task whose goal is to trade off performance against energy consumption. In particular, we propose a fully decentralized client selection policy based on non-stationary multi-armed bandits, in which clients autonomously decide whether or not to participate in the training process without relying on the central server. The proposed solution outperforms the random client selection policy, reducing both the number of rounds required to reach the target accuracy and the energy consumption by up to 12%.</p>
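The abstract describes each client running its own non-stationary bandit with two actions, participate or abstain. As a minimal illustrative sketch (not the paper's exact policy), the idea can be expressed with a discounted-UCB bandit per client, where the reward, discount factor `gamma`, and exploration weight `c` are all assumptions introduced here for illustration:

```python
import math

class ClientBanditSelector:
    """Per-client two-armed bandit: arm 0 = abstain, arm 1 = participate.

    Hypothetical sketch of a non-stationary policy using discounted UCB;
    the paper's actual reward definition and parameters may differ.
    """

    def __init__(self, gamma=0.95, c=0.5):
        self.gamma = gamma          # discount factor: forgets stale rounds
        self.c = c                  # exploration weight
        self.counts = [0.0, 0.0]    # discounted pull counts per arm
        self.values = [0.0, 0.0]    # discounted reward sums per arm

    def choose(self):
        # Try each arm once before trusting the estimates.
        if min(self.counts) == 0.0:
            return self.counts.index(0.0)
        total = sum(self.counts)
        ucb = [
            self.values[a] / self.counts[a]
            + self.c * math.sqrt(math.log(total) / self.counts[a])
            for a in (0, 1)
        ]
        return max((0, 1), key=lambda a: ucb[a])

    def update(self, arm, reward):
        # Discount old statistics so recent rounds dominate (non-stationarity).
        for a in (0, 1):
            self.counts[a] *= self.gamma
            self.values[a] *= self.gamma
        self.counts[arm] += 1.0
        self.values[arm] += reward
```

In each federated round a client calls `choose()` and, after the round, feeds back a reward that balances the two objectives from the abstract, e.g. observed accuracy gain minus a weighted energy cost; no coordination with the server is needed for the decision itself.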

Keywords:
Federated learning; Energy consumption; Selection algorithm; Distributed computing; Machine learning; Artificial intelligence; Computer networks; Computer science

Metrics

Cited By: 1
FWCI (Field-Weighted Citation Impact): 0.26
References: 21
Citation Normalized Percentile: 0.60


Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Age of Information Optimization (Physical Sciences → Computer Science → Computer Networks and Communications)
IoT and Edge/Fog Computing (Physical Sciences → Computer Science → Computer Networks and Communications)