Youqing Hua, Shuai Liu, Yiguang Hong, Wei Ren
The dual challenges of prohibitive communication overhead and the impracticality of gradient computation, due to data privacy or black-box constraints in distributed systems, motivate this work on communication-constrained gradient-free optimization. We propose a stochastic distributed zeroth-order algorithm (Com-DSZO) that requires only two function evaluations per iteration and integrates general compression operators. Rigorous analysis establishes its sublinear convergence rate for both smooth and nonsmooth objectives, while explicitly elucidating the compression-convergence trade-off. Furthermore, we develop a variance-reduced variant (VR-Com-DSZO) under stochastic mini-batch feedback. The empirical performance of both algorithms is illustrated with numerical examples.
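The paper's exact update rule is not reproduced in the abstract; the following is a minimal sketch of the two ingredients it names: a two-point (two-function-evaluation) zeroth-order gradient estimator and a generic compression operator, with top-k compression used here as a stand-in. All function names and parameters are illustrative, not from the paper.

```python
import numpy as np

def two_point_grad_estimate(f, x, mu=1e-4, rng=None):
    """Classical two-point zeroth-order gradient estimator:
    g = d * (f(x + mu*u) - f(x - mu*u)) / (2*mu) * u,
    with u drawn uniformly from the unit sphere. Its expectation
    approximates the gradient of a smoothed version of f."""
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    u = rng.standard_normal(d)
    u /= np.linalg.norm(u)               # uniform direction on the sphere
    return d * (f(x + mu * u) - f(x - mu * u)) / (2 * mu) * u

def top_k_compress(v, k):
    """Example compression operator: keep the k largest-magnitude
    entries of v and zero out the rest (a contractive compressor)."""
    out = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    out[idx] = v[idx]
    return out

if __name__ == "__main__":
    # Toy single-agent illustration on f(x) = ||x||^2: each iteration
    # uses exactly two function evaluations and transmits only the
    # compressed estimate.
    rng = np.random.default_rng(0)
    f = lambda x: float(np.dot(x, x))
    x = np.ones(5)
    for _ in range(500):
        g = two_point_grad_estimate(f, x, mu=1e-4, rng=rng)
        x = x - 0.02 * top_k_compress(g, 3)
    print(f(x))  # near zero after 500 iterations
```

The distributed algorithm would additionally mix compressed iterates or gradient surrogates with neighbors over a network; that consensus step is omitted here for brevity.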