JOURNAL ARTICLE

Communication-Efficient Federated Multi-Task Learning with Sparse Sharing

Abstract

Federated multi-task learning (FMTL) is a promising approach to the severe data heterogeneity problem in federated learning (FL): each client learns an individual model locally, and the server extracts the model parameters that similar tasks share while keeping the clients' models personalized. It is therefore essential to precisely extract the model parameters shared among tasks. At the same time, limited communication resources restrict model transmission and thus degrade FMTL performance. To address these issues, we propose a novel FMTL with Sparse Sharing (FedSS) mechanism that allows clients to share model parameters dynamically, according to their diversified model structures, under limited communication resources. In particular, we present an adaptive quantization approach for task relevance, which serves as a metric for the extent of model sharing across tasks. The objective function is formulated to minimize model transmission latency while ensuring FMTL learning performance, via a joint bandwidth allocation and client selection strategy. Closed-form expressions for the optimal client selection and bandwidth allocation are derived with an alternating direction method of multipliers (ADMM) algorithm. Numerical results show that the proposed FedSS outperforms the benchmarks and achieves communication-efficient performance.
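The abstract names two concrete mechanisms: quantized task relevance driving sparse parameter sharing, and a closed-form bandwidth allocation obtained from a latency-minimization problem. Below is a minimal Python sketch of both ideas, not the paper's actual method: the cosine-similarity relevance metric, the uniform (non-adaptive) quantizer, the sharing threshold, and the latency-equalizing allocation are all illustrative assumptions, since the abstract does not specify these details.

```python
import numpy as np

def task_relevance(theta_i, theta_j):
    # Cosine similarity between two clients' flattened parameter vectors --
    # a hypothetical stand-in for the paper's task-relevance metric.
    denom = np.linalg.norm(theta_i) * np.linalg.norm(theta_j) + 1e-12
    return float(np.dot(theta_i, theta_j)) / denom

def quantize_relevance(r, levels=4):
    # Uniformly quantize a score in [-1, 1] onto `levels` grid points.
    # The paper's quantizer is adaptive; this sketch does not model that.
    step = 2.0 / (levels - 1)
    return round((float(np.clip(r, -1.0, 1.0)) + 1.0) / step) * step - 1.0

def sparse_share(models, threshold=0.5):
    # For each client, average parameters only with peers whose quantized
    # relevance clears `threshold`; otherwise keep the personalized model.
    out = []
    for i, ti in enumerate(models):
        peers = [tj for j, tj in enumerate(models)
                 if j != i and
                 quantize_relevance(task_relevance(ti, tj)) >= threshold]
        out.append(np.mean([ti] + peers, axis=0) if peers else ti)
    return out

def allocate_bandwidth(bits, total_bw):
    # Minimizing the maximum upload latency bits_k / b_k subject to
    # sum(b_k) = total_bw equalizes the latencies, so b_k is proportional
    # to bits_k -- an illustrative closed form, not the paper's
    # ADMM-derived solution.
    total = sum(bits)
    return [total_bw * b / total for b in bits]

# Toy usage: clients 0 and 1 have related tasks; client 2 typically does not.
rng = np.random.default_rng(0)
base = rng.normal(size=32)
models = [base + 0.01 * rng.normal(size=32),
          base + 0.01 * rng.normal(size=32),
          rng.normal(size=32)]
shared = sparse_share(models)
print(allocate_bandwidth(bits=[1.2e6, 0.8e6, 2.0e6], total_bw=10e6))
```

The allocate_bandwidth function also hints at why closed forms can arise here: with a min-max latency objective and a single total-bandwidth constraint, the optimum equalizes per-client latencies, which pins down each bandwidth share directly.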

Keywords:
Computer science; Personalization; Federated learning; Performance metric; Distributed computing; Reinforcement learning; Artificial intelligence; Machine learning

Metrics

Cited By: 2
FWCI (Field Weighted Citation Impact): 0.51
Refs: 16
Citation Normalized Percentile: 0.67

Topics

Privacy-Preserving Technologies in Data (Physical Sciences → Computer Science → Artificial Intelligence)
Recommender Systems and Techniques (Physical Sciences → Computer Science → Information Systems)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)

Related Documents

JOURNAL ARTICLE

Accelerating Communication-Efficient Federated Multi-Task Learning With Personalization and Fairness

Renyou Xie, Chaojie Li, Xiaojun Zhou, Zhaoyang Dong

Journal: IEEE Transactions on Parallel and Distributed Systems, Year: 2024, Vol: 35 (11), Pages: 2239-2253
JOURNAL ARTICLE

FedMT: Multi-Task Federated Learning with Competitive GPU Resource Sharing

Yu, Yongbo; Yu, Fuxun; Xu, Zirui; Wang, Di; Zhang, Minjia; Li, Ang; Liu, Chenchen; Tian, Zhi; Chen, Xiang

Journal: Digital Repository at the University of Maryland (University of Maryland College Park), Year: 2025