JOURNAL ARTICLE

Adaptive Stochastic Variance Reduction for Subsampled Newton Method with Cubic Regularization

Junyu Zhang, Lin Xiao, Shuzhong Zhang

Year: 2021   Journal: INFORMS Journal on Optimization   Vol: 4 (1), Pages: 45-64   Publisher: Institute for Operations Research and the Management Sciences

Abstract

The cubic regularized Newton method of Nesterov and Polyak has become increasingly popular for nonconvex optimization because it can find an approximate local solution with a second-order guarantee at a low iteration complexity. Several recent works extend this method to the setting of minimizing the average of N smooth functions by replacing the exact gradients and Hessians with subsampled approximations. By leveraging stochastic variance reduction techniques, the total Hessian sample complexity per iteration can be reduced to be sublinear in N. We present an adaptive variance reduction scheme for a subsampled Newton method with cubic regularization and show that the expected Hessian sample complexity is [Formula: see text] for finding an [Formula: see text]-approximate local solution (in terms of first- and second-order guarantees, respectively). Moreover, we show that the same Hessian sample complexity is retained with fixed sample sizes if exact gradients are used. Our analysis differs from previous works in that we do not rely on high-probability bounds based on matrix concentration inequalities. Instead, we derive and utilize new bounds on the third and fourth order moments of the average of random matrices, which are of independent interest.
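To make the two ingredients in the abstract concrete, here is a minimal Python sketch, not the authors' algorithm: a bisection solver for the cubic-regularized Newton subproblem min_s g·s + ½ s·Hs + (σ/3)‖s‖³ (via the secular equation s = -(H + λI)⁻¹g with λ = σ‖s‖), and a variance-reduced Hessian estimator of the generic SVRG form built from a snapshot point. All function names and the sampling scheme are illustrative assumptions; the degenerate "hard case" of the subproblem is not handled.

```python
import numpy as np

def cubic_newton_step(g, H, sigma, tol=1e-10):
    """Minimize m(s) = g@s + 0.5*s@H@s + (sigma/3)*||s||^3.
    Stationarity gives s = -(H + lam*I)^{-1} g with lam = sigma*||s||;
    lam is found by bisection on this scalar secular equation.
    (Sketch only: the degenerate 'hard case' is not handled.)"""
    w, V = np.linalg.eigh(H)               # eigenvalues in ascending order
    gt = V.T @ g                           # gradient in the eigenbasis
    norm_s = lambda lam: np.linalg.norm(gt / (w + lam))
    lo = max(0.0, -w[0]) + 1e-12           # keep H + lam*I positive definite
    hi = max(lo, 1.0)
    while sigma * norm_s(hi) > hi:         # grow until the residual changes sign
        hi *= 2.0
    for _ in range(200):                   # bisection on sigma*||s(lam)|| = lam
        mid = 0.5 * (lo + hi)
        if sigma * norm_s(mid) > mid:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)
    return V @ (-gt / (w + lam))

def svr_hessian(hess_i, batch, x, x_snap, H_snap):
    """Variance-reduced Hessian estimate from a small index batch:
    H_snap + (1/|batch|) * sum_i (hess_i(i, x) - hess_i(i, x_snap)),
    where H_snap is the full Hessian computed once at the snapshot x_snap."""
    diff = sum(hess_i(i, x) - hess_i(i, x_snap) for i in batch) / len(batch)
    return H_snap + diff
```

The estimator is unbiased because the batch correction has expectation equal to the true Hessian change since the snapshot, and its variance shrinks as x stays close to x_snap, which is what lets the per-iteration Hessian sample size be sublinear in N.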

Keywords:
Hessian matrix, Variance reduction, Mathematics, Sublinear function, Applied mathematics, Regularization (mathematics), Reduction (mathematics), Newton's method, Mathematical optimization, Combinatorics, Computer science, Statistics, Monte Carlo method, Nonlinear system

Metrics

Cited By: 1
FWCI (Field-Weighted Citation Impact): 0.00
Refs: 11
Citation Normalized Percentile: 0.17


Topics

Sparse and Compressive Sensing Techniques
Physical Sciences →  Engineering →  Computational Mechanics
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Advanced Optimization Algorithms Research
Physical Sciences →  Mathematics →  Numerical Analysis

Related Documents

JOURNAL ARTICLE

Stochastic sub-sampled Newton method with variance reduction

Zhijian Luo, Yuntao Qian

Journal: International Journal of Wavelets Multiresolution and Information Processing   Year: 2019   Vol: 17 (06), Pages: 1950041-1950041

JOURNAL ARTICLE

Subsampled cubic regularization method for finite-sum minimization

Max L. N. Gonçalves

Journal: Optimization   Year: 2024   Vol: 74 (7), Pages: 1591-1614

JOURNAL ARTICLE

Riemannian Stochastic Variance-Reduced Cubic Regularized Newton Method for Submanifold Optimization

Dewei Zhang, Sam Davanloo Tajbakhsh

Journal: Journal of Optimization Theory and Applications   Year: 2022   Vol: 196 (1), Pages: 324-361