JOURNAL ARTICLE

A Nonlinear Conjugate Gradient Algorithm under Strong Wolfe-Powell Line Search for Large Scale Unconstrained Optimization Problems

Abstract

The conventional conjugate gradient method solves linear and quadratic optimization problems, but most real-life problems involve nonquadratic functions of several variables. In this work, a nonlinear conjugate gradient algorithm for solving large-scale unconstrained optimization problems is presented. The new algorithm is a modification of the Fletcher-Reeves conjugate gradient method and is proved to be globally convergent under the strong Wolfe-Powell inexact line search. Computational experiments show that the new algorithm outperforms the Fletcher-Reeves method with exact line search on high-dimensional nonlinear optimization problems.
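To make the method concrete, below is a minimal sketch of the classical Fletcher-Reeves nonlinear conjugate gradient iteration under a strong Wolfe line search. It is not the modified algorithm of this article (whose update rule is not given in the abstract); the function name `fr_cg`, the tolerances, and the fallback to steepest descent are illustrative choices, and SciPy's `line_search` routine is assumed as the strong-Wolfe step-size oracle.

```python
import numpy as np
from scipy.optimize import line_search

def fr_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Classical Fletcher-Reeves nonlinear CG with a strong Wolfe line search.

    This is an illustrative sketch, not the modified algorithm of the paper.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # scipy.optimize.line_search enforces the strong Wolfe conditions
        # with sufficient-decrease parameter c1 and curvature parameter c2.
        alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
        if alpha is None:
            # Line search failed along d; restart with steepest descent.
            d = -g
            alpha = line_search(f, grad, x, d, gfk=g, c1=1e-4, c2=0.1)[0]
            if alpha is None:
                break
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # new conjugate search direction
        g = g_new
    return x
```

On a strictly convex quadratic the iteration reduces to linear CG (up to the inexactness of the line search); on nonquadratic functions the Fletcher-Reeves coefficient keeps successive directions approximately conjugate, which is the setting the article's convergence analysis addresses.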

Keywords:
Conjugate gradient method; Nonlinear conjugate gradient method; Fletcher-Reeves method; Line search; Strong Wolfe-Powell conditions; Global convergence; Unconstrained optimization
