JOURNAL ARTICLE

Zeroth and First Order Stochastic Frank-Wolfe Algorithms for Constrained Optimization

Zeeshan Akhtar, Ketan Rajawat

Year: 2022 · Journal: IEEE Transactions on Signal Processing · Vol: 70 · Pages: 2119-2135 · Publisher: Institute of Electrical and Electronics Engineers

Abstract

This paper considers stochastic convex optimization problems with two sets of constraints: (a) deterministic constraints on the domain of the optimization variable, which are difficult to project onto; and (b) deterministic or stochastic constraints that admit efficient projection. Problems of this form arise frequently in the context of semidefinite programming, as well as when various NP-hard problems are solved approximately via semidefinite relaxation. Since projection onto the first set of constraints is difficult, it becomes necessary to explore projection-free algorithms, such as the stochastic Frank-Wolfe (FW) algorithm. The second set of constraints, on the other hand, cannot be handled in the same way and must be incorporated as an indicator function within the objective function, thereby complicating the application of FW methods. Similar problems have been studied before; however, existing approaches suffer from slow convergence rates. This work, equipped with a momentum-based gradient tracking technique, guarantees fast convergence rates on par with the best-known rates for problems without the second set of constraints. Zeroth-order variants of the proposed algorithms are also developed and again improve upon the state-of-the-art rate results. We further propose novel trimmed FW variants that enjoy the same convergence rates as their classical counterparts but are empirically shown to require significantly fewer calls to the linear minimization oracle, speeding up the overall algorithm. The efficacy of the proposed algorithms is tested on relevant applications: sparse matrix estimation, clustering via semidefinite relaxation, and the uniform sparsest cut problem.
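The abstract's core ingredients, a projection-free step via a linear minimization oracle (LMO) combined with momentum-based gradient tracking of noisy gradients, can be illustrated with a short sketch. This is not the paper's algorithm or notation; the function names, step-size and momentum schedules, and the simplex example are illustrative assumptions only.

```python
import numpy as np

def stochastic_fw_momentum(grad_oracle, lmo, x0, T, rho=1.0):
    """Illustrative stochastic Frank-Wolfe loop with momentum gradient tracking.

    grad_oracle(x): returns a noisy (stochastic) gradient estimate at x.
    lmo(d):         linear minimization oracle, returns argmin_{s in C} <d, s>.
    All names and schedules here are assumptions for the sketch, not the
    paper's method.
    """
    x = x0.copy()
    d = grad_oracle(x)  # gradient tracker, initialized with one sample
    for t in range(1, T + 1):
        rho_t = min(1.0, rho / (t + 1) ** (2 / 3))  # decaying momentum weight
        g = grad_oracle(x)
        d = (1 - rho_t) * d + rho_t * g   # momentum-based gradient tracking
        s = lmo(d)                        # projection-free direction via LMO
        gamma = 2 / (t + 2)               # classical FW step size
        x = (1 - gamma) * x + gamma * s   # convex combination stays feasible
    return x

# Toy usage: minimize E||x - target||^2 over the probability simplex,
# where the LMO returns a vertex (a coordinate basis vector).
rng = np.random.default_rng(0)
target = np.array([0.2, 0.3, 0.5])

def grad_oracle(x):
    return 2 * (x - target) + 0.01 * rng.standard_normal(3)

def simplex_lmo(d):
    s = np.zeros_like(d)
    s[np.argmin(d)] = 1.0
    return s

x_hat = stochastic_fw_momentum(grad_oracle, simplex_lmo, np.ones(3) / 3, T=2000)
```

Because each iterate is a convex combination of feasible points, no projection is ever needed; that is the appeal of FW methods when the constraint set (e.g., a spectrahedron in semidefinite programming) is expensive to project onto but admits a cheap LMO.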

Keywords:
Semidefinite programming; Rate of convergence; Mathematical optimization; Semidefinite relaxation; Proximal gradient methods; Stochastic optimization; Linear minimization oracle; Optimization problem; Convex optimization

Metrics

Cited By: 9
FWCI (Field-Weighted Citation Impact): 2.28
References: 126
Citation Normalized Percentile: 0.77


Topics

Sparse and Compressive Sensing Techniques
Physical Sciences →  Engineering →  Computational Mechanics
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Optimization Algorithms Research
Physical Sciences →  Mathematics →  Numerical Analysis