JOURNAL ARTICLE

Faster Riemannian Newton-type optimization by subsampling and cubic regularization

Yian Deng, Tingting Mu

Year: 2023   Journal: Machine Learning   Vol: 112 (9)   Pages: 3527-3589   Publisher: Springer Science+Business Media

Abstract

This work is on constrained large-scale non-convex optimization where the constraint set implies a manifold structure. Solving such problems is important in a multitude of fundamental machine learning tasks. Recent advances in Riemannian optimization have enabled the convenient recovery of solutions by adapting unconstrained optimization algorithms over manifolds. However, it remains challenging to scale up and meanwhile maintain stable convergence rates and handle saddle points. We propose a new second-order Riemannian optimization algorithm, aiming at improving convergence rate and reducing computational cost. It enhances the Riemannian trust-region algorithm that explores curvature information to escape saddle points through a mixture of subsampling and cubic regularization techniques. We conduct rigorous analysis to study the convergence behavior of the proposed algorithm. We also perform extensive experiments to evaluate it based on two general machine learning tasks using multiple datasets. The proposed algorithm exhibits improved computational speed, e.g., a speed improvement from $12\%$ to $227\%$, and improved convergence behavior, e.g., an iteration number reduction from $\mathcal{O}\left(\max\left(\epsilon_g^{-2}\epsilon_H^{-1},\epsilon_H^{-3}\right)\right)$ to $\mathcal{O}\left(\max\left(\epsilon_g^{-2},\epsilon_H^{-3}\right)\right)$, compared to a large set of state-of-the-art Riemannian optimization algorithms.
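In the complexity bounds above, $\epsilon_g$ and $\epsilon_H$ are the standard tolerances for approximate second-order stationarity: the algorithm returns a point $x$ with $\Vert\operatorname{grad} f(x)\Vert \le \epsilon_g$ and $\lambda_{\min}\left(\operatorname{Hess} f(x)\right) \ge -\epsilon_H$.

To make the mechanism concrete, the following is a minimal NumPy sketch of the kind of update the abstract describes: a Riemannian gradient and Hessian estimated on a random subsample, a cubic-regularized Newton-type model minimized approximately in the tangent space, and a retraction back onto the manifold. The example problem (leading eigenvector of a sample covariance, posed on the unit sphere), the batch size, the regularization weight sigma, and the inner solver are illustrative assumptions, not the paper's actual algorithm, which additionally uses trust-region-style step acceptance and adaptive regularization.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical finite-sum problem: minimize f(x) = -(1/n) * sum_i (a_i^T x)^2
# over the unit sphere; the minimizer is the leading eigenvector of A^T A / n.
n, d = 2000, 50
A = rng.standard_normal((n, d))

def tangent_project(x, v):
    # Orthogonal projection onto the tangent space of the sphere at x.
    return v - np.dot(x, v) * x

def subsampled_grad_hess(x, batch):
    # Euclidean gradient and Hessian estimated on a random subsample.
    Ab = A[batch]
    ax = Ab @ x
    egrad = -2.0 * (Ab.T @ ax) / len(batch)
    ehess = -2.0 * (Ab.T @ Ab) / len(batch)
    # Riemannian gradient and Hessian-vector product for the embedded sphere.
    rgrad = tangent_project(x, egrad)
    def rhess(u):
        return tangent_project(x, ehess @ u) - np.dot(x, egrad) * u
    return rgrad, rhess

def cubic_step(x, g, hess, sigma, inner_iters=50, lr=0.05):
    # Approximately minimize the cubic-regularized model
    #   m(s) = <g, s> + 0.5 <s, H s> + (sigma / 3) * ||s||^3
    # over the tangent space at x by projected gradient descent.
    s = np.zeros_like(x)
    for _ in range(inner_iters):
        grad_m = g + hess(s) + sigma * np.linalg.norm(s) * s
        s = tangent_project(x, s - lr * grad_m)
    return s

x = rng.standard_normal(d)
x /= np.linalg.norm(x)
sigma, batch_size = 1.0, 200  # assumed regularization weight and sample size
for it in range(30):
    batch = rng.choice(n, size=batch_size, replace=False)
    g, hess = subsampled_grad_hess(x, batch)
    s = cubic_step(x, g, hess, sigma)
    x = (x + s) / np.linalg.norm(x + s)  # retraction back onto the sphere
    if it % 10 == 0:
        print(f"iter {it:2d}  subsampled |grad| = {np.linalg.norm(g):.3e}")

The cubic term (sigma / 3) * ||s||^3 is what bounds the step length without an explicit trust-region radius: a step that sufficiently decreases the model also decreases f, provided the subsampled Hessian is accurate enough on the batch, which is where the paper's subsampling analysis comes in.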

Keywords:
Algorithm, Regularization, Convergence, Saddle point, Computer science, Curvature, Mathematics, Machine learning, Artificial intelligence, Geometry

Metrics

Cited by: 2
FWCI (Field-Weighted Citation Impact): 0.67
References: 70
Citation Normalized Percentile: 0.55

Topics

Sparse and Compressive Sensing Techniques (Physical Sciences → Engineering → Computational Mechanics)
Stochastic Gradient Optimization Techniques (Physical Sciences → Computer Science → Artificial Intelligence)
Human Pose and Action Recognition (Physical Sciences → Computer Science → Computer Vision and Pattern Recognition)

Related Documents

JOURNAL ARTICLE

Cubic Regularization Technique of the Newton Method for Vector Optimization

Debdas Ghosh

Journal: Journal of Optimization Theory and Applications   Year: 2025   Vol: 207 (2)
JOURNAL ARTICLE

Riemannian Stochastic Variance-Reduced Cubic Regularized Newton Method for Submanifold Optimization

Dewei Zhang, Sam Davanloo Tajbakhsh

Journal: Journal of Optimization Theory and Applications   Year: 2022   Vol: 196 (1)   Pages: 324-361
JOURNAL ARTICLE

Manifold regularization based on Nyström type subsampling

Abhishake, S. Sivananthan

Journal: Applied and Computational Harmonic Analysis   Year: 2018   Vol: 49 (1)   Pages: 152-179