JOURNAL ARTICLE

Scaled Diagonal Gradient-Type Method with Extra Update for Large-Scale Unconstrained Optimization

Abstract

We present a new gradient method that uses scaling and extra updating within diagonal updating for solving unconstrained optimization problems. The new method follows the framework of the Barzilai and Borwein (BB) method, except that the Hessian matrix is approximated by a diagonal matrix rather than by the multiple of the identity matrix used in the BB method. The main idea is to design a new diagonal updating scheme that incorporates scaling to promptly reduce large eigenvalues of the diagonal approximation, and otherwise employs extra updates to increase small eigenvalues. Together, these devices give rapid control over the eigenvalues of the updating matrix and thus improve stepwise convergence. We show that our method is globally convergent. The effectiveness of the method is evaluated by numerical comparison with the BB method and its variant.
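The abstract describes a BB-type iteration in which the scalar BB step is replaced by a diagonal Hessian approximation whose eigenvalues are kept under control. The sketch below illustrates that general framework only: the diagonal update shown is a standard weak-secant-style rule, and the eigenvalue safeguard is plain clipping, both stand-ins for (not reproductions of) the paper's scaling and extra-update formulas.

```python
import numpy as np

def diag_bb_sketch(grad, x0, iters=500, tol=1e-8):
    """Gradient method with a diagonal Hessian approximation (illustrative).

    BB-like framework: x_{k+1} = x_k - D_k^{-1} g_k, where D_k is a
    diagonal matrix (stored as a vector) updated from the weak secant
    condition s^T D s = s^T y.  The clipping at the end is a generic
    placeholder for eigenvalue control, not the paper's exact scheme.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    D = np.ones_like(x)                 # initial diagonal approximation
    for _ in range(iters):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - g / D               # diagonal quasi-Newton step
        g_new = grad(x_new)
        s, y = x_new - x, g_new - g
        s2 = s * s
        if s2 @ s2 > 0:
            # least-change diagonal update enforcing s^T D s = s^T y,
            # spread across components in proportion to s_i^2
            D = D + (s @ y - s2 @ D) / (s2 @ s2) * s2
        # safeguard: cap large eigenvalues and lift small ones (here by
        # clipping; the paper uses scaling and extra updates instead)
        D = np.clip(D, 1e-4, 1e4)
        x, g = x_new, g_new
    return x

# usage: minimize the convex quadratic f(x) = 0.5 * x^T diag(a) x
a = np.array([2.0, 10.0])
xstar = diag_bb_sketch(lambda x: a * x, np.array([1.0, 1.0]))
```

On this diagonal quadratic the update drives `D` toward the true diagonal Hessian, so the iterates converge to the minimizer at the origin.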

Keywords:
Hessian matrix; diagonal scaling; diagonal matrix; eigenvalues and eigenvectors; identity matrix

Metrics

Cited by: 0
FWCI (Field-Weighted Citation Impact): 0.00
References: 0
Citation Normalized Percentile: 0.43

Topics

Advanced Optimization Algorithms Research
Physical Sciences →  Mathematics →  Numerical Analysis
Stochastic Gradient Optimization Techniques
Physical Sciences →  Computer Science →  Artificial Intelligence
Matrix Theory and Algorithms
Physical Sciences →  Computer Science →  Computational Theory and Mathematics

© 2026 ScienceGate Book Chapters — All rights reserved.