JOURNAL ARTICLE

Finite-Time Analysis of Markov Gradient Descent

Thinh T. Doan

Year: 2022
Journal: IEEE Transactions on Automatic Control
Volume: 68, Issue 4
Pages: 2140–2153
Publisher: Institute of Electrical and Electronics Engineers (IEEE)

Abstract

Motivated by its broad applications in system identification, stochastic control, and machine learning, we study the popular stochastic gradient descent (SGD) method when the gradient samples of the underlying objective function are generated by a Markov process. Markov sampling makes the gradient samples biased and dependent. Existing convergence results for SGD in Markov settings are often established under the assumption that either the iterates or the gradient samples are bounded. This assumption can be guaranteed through an impractical projection step onto a compact set, often defined in terms of the unknown optimal solution of the underlying problem. In this article, we show that this projection step is unnecessary in many settings. In particular, we study the convergence of SGD under Markov samples without any projection step for a range of objective functions, from strongly convex objectives to nonconvex objectives satisfying the Polyak–Łojasiewicz condition. We show that SGD converges at nearly the same rate with Markovian gradient samples as with independent gradient samples; the only difference is a logarithmic factor that accounts for the mixing time of the Markov process. Finally, we provide numerical experiments on robust identification problems to illustrate our theoretical results, where we consider SGD with samples generated by a Markov process.
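To make the setting concrete, the following is a minimal sketch (not the paper's experiments) of projection-free SGD on a toy strongly convex least-squares problem, where the sample index is driven by a lazy random walk on a cycle, so consecutive gradient samples are correlated rather than i.i.d. The problem sizes, chain, and step-size schedule are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy strongly convex objective: f(x) = (1/2n) * sum_i (a_i . x - b_i)^2
n, d = 50, 5
A = rng.normal(size=(n, d))
x_star = rng.normal(size=d)
b = A @ x_star                          # noiseless targets, so x_star is optimal
x_opt = np.linalg.lstsq(A, b, rcond=None)[0]

def next_state(i):
    # Markov sampling: lazy random walk on the cycle {0, ..., n-1};
    # the next index depends on the current one, so samples are not i.i.d.
    return (i + rng.choice([-1, 0, 1])) % n

x = np.zeros(d)                         # no projection step anywhere below
i = 0
for t in range(1, 20001):
    i = next_state(i)
    grad = (A[i] @ x - b[i]) * A[i]     # gradient of the i-th component only
    x -= (1.0 / (0.1 * t + 10.0)) * grad  # diminishing step size (illustrative)

err0 = np.linalg.norm(np.zeros(d) - x_opt)  # initial distance to optimum
err = np.linalg.norm(x - x_opt)             # final distance to optimum
```

Despite the bias and dependence introduced by the Markov chain, the iterates approach the optimum; per the article's analysis, the price of Markovian sampling is only a logarithmic (mixing-time) factor in the rate.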

Keywords:
Stochastic gradient descent, Markov chain, Markov process, Mathematical optimization, Applied mathematics, Projection (mathematics), Iterated function, Gradient descent, Convergence (mathematics), Rate of convergence, Gradient method, Algorithm, Artificial intelligence, Statistics, Mathematical analysis, Artificial neural network

Metrics

Cited by: 6
FWCI (Field-Weighted Citation Impact): 2.50
References: 44
Citation Normalized Percentile: 0.83

Topics

Markov Chains and Monte Carlo Methods (Physical Sciences → Mathematics → Statistics and Probability)
Statistical Methods and Inference (Physical Sciences → Mathematics → Statistics and Probability)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)