JOURNAL ARTICLE

Compressed Differentially Private Distributed Optimization with Linear Convergence

Antai Xie, Xinlei Yi, Xiaofan Wang, Ming Cao, Xiaoqiang Ren

Year: 2023 | Journal: IFAC-PapersOnLine | Vol: 56 (2) | Pages: 8369-8374 | Publisher: Elsevier BV

Abstract

This paper addresses differentially private distributed optimization under limited communication, where each agent aims to keep its cost function private while the agents jointly minimize the sum of all agents' cost functions. In response, we propose a novel Compressed differentially Private distributed Gradient Tracking algorithm (CPGT). We demonstrate that CPGT achieves linear convergence for smooth and strongly convex cost functions, even with a class of biased but contractive compressors, and achieves the same accuracy as its counterpart with idealized (uncompressed) communication. Additionally, we rigorously prove that CPGT ensures differential privacy. Simulations are provided to validate the effectiveness of the proposed algorithm.
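The ingredients described in the abstract can be illustrated with a toy sketch: each agent perturbs its state and gradient tracker with Laplace noise before transmission (privacy), compresses the outgoing message with a biased but contractive top-k compressor (limited communication), and runs gradient tracking over a doubly stochastic mixing matrix. This is a hedged simplification, not the authors' actual CPGT; the function names, quadratic costs, and all parameter values below are illustrative assumptions.

```python
import numpy as np

def top_k(v, k):
    # Biased but contractive compressor: keep the k largest-magnitude
    # entries of v and zero the rest (lossless when k == len(v)).
    out = np.zeros_like(v)
    keep = np.argsort(np.abs(v))[-k:]
    out[keep] = v[keep]
    return out

def cpgt_sketch(a, W, eta=0.1, b=1e-3, k=1, iters=400, seed=0):
    # Toy noisy, compressed gradient tracking for the quadratic costs
    # f_i(x) = 0.5 * ||x - a_i||^2, whose gradient is x - a_i, so the
    # minimizer of sum_i f_i is the mean of the rows of a.
    # b is the Laplace noise scale injected before each transmission.
    rng = np.random.default_rng(seed)
    n, d = a.shape
    x = np.zeros((n, d))           # local estimates
    g = x - a                      # local gradients
    y = g.copy()                   # gradient trackers
    for _ in range(iters):
        # Perturb (privacy), then compress (bandwidth), then mix (consensus).
        x_msg = np.stack([top_k(row + rng.laplace(0.0, b, d), k) for row in x])
        y_msg = np.stack([top_k(row + rng.laplace(0.0, b, d), k) for row in y])
        x_new = W @ x_msg - eta * y
        g_new = x_new - a
        y = W @ y_msg + g_new - g  # track the average gradient
        x, g = x_new, g_new
    return x
```

With three agents holding scalar targets 1, 2, and 3 and a fully connected averaging matrix, the estimates settle near the global minimizer 2 up to a noise-induced error floor. Note that with d = 1 the top-k compressor passes messages through unchanged; the paper's algorithm additionally handles genuinely lossy compression while retaining linear convergence.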

Keywords:
Differential privacy, Distributed optimization, Convex optimization, Mathematical optimization, Convergence, Optimization problem, Algorithm, Computer science, Mathematics

Metrics

Cited by: 4
FWCI (Field-Weighted Citation Impact): 1.76
References: 31
Citation Normalized Percentile: 0.75

Topics

Distributed Control Multi-Agent Systems
Physical Sciences →  Computer Science →  Computer Networks and Communications
Privacy-Preserving Technologies in Data
Physical Sciences →  Computer Science →  Artificial Intelligence
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence