JOURNAL ARTICLE

An Improved Conjugate Gradient Based Learning Algorithm For Back Propagation Neural Networks

Abstract

The conjugate gradient optimization algorithm is combined with a modified back propagation algorithm to yield a computationally efficient training algorithm for multilayer perceptron (MLP) networks (CGFR/AG). Computational efficiency is improved by adaptively modifying the initial search direction in three steps: (1) the standard back propagation algorithm is modified by introducing a gain variation term into the activation function; (2) the gradient of the error is computed with respect to both the weight and gain values; and (3) a new search direction is determined from the information computed in step (2). The performance of the proposed method is demonstrated by comparing its accuracy and computation time with those of the conjugate gradient algorithm in the MATLAB Neural Network Toolbox. The results show that the proposed method is computationally more efficient than the standard conjugate gradient algorithm.
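The three steps above can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' implementation: it assumes one hidden layer, a single scalar gain per layer (the paper introduces a gain variation term in the activation function), a sum-of-squares error, and a simple Armijo backtracking line search in place of the paper's step-size rule.

```python
import numpy as np

def sigmoid(z, gain):
    # Logistic activation with a gain term c: f(z) = 1 / (1 + exp(-c * z))
    return 1.0 / (1.0 + np.exp(-gain * z))

def unpack(p, nin, nh, nout):
    # Flat parameter vector -> (W1, c1, W2, c2); one scalar gain per layer (assumption).
    i = nin * nh
    W1 = p[:i].reshape(nin, nh)
    c1 = p[i]
    W2 = p[i + 1:i + 1 + nh * nout].reshape(nh, nout)
    c2 = p[i + 1 + nh * nout]
    return W1, c1, W2, c2

def loss_and_grad(p, X, t, nh):
    # Steps (1)-(2): forward pass with gains, then gradients w.r.t. weights AND gains.
    nin, nout = X.shape[1], t.shape[1]
    W1, c1, W2, c2 = unpack(p, nin, nh, nout)
    n = X.shape[0]
    a1 = X @ W1; h = sigmoid(a1, c1)
    a2 = h @ W2; y = sigmoid(a2, c2)
    e = y - t
    loss = 0.5 * np.sum(e * e) / n                 # sum-of-squares error
    d2 = e * c2 * y * (1.0 - y)                    # delta through the output gain
    dh = d2 @ W2.T
    d1 = dh * c1 * h * (1.0 - h)                   # delta through the hidden gain
    gW1 = X.T @ d1 / n
    gc1 = np.sum(dh * h * (1.0 - h) * a1) / n      # dE/dc1
    gW2 = h.T @ d2 / n
    gc2 = np.sum(e * y * (1.0 - y) * a2) / n       # dE/dc2
    return loss, np.concatenate([gW1.ravel(), [gc1], gW2.ravel(), [gc2]])

def train_cg(X, t, nh=5, iters=100, seed=0):
    # Step (3): Fletcher-Reeves conjugate gradient over the joint weight/gain vector.
    rng = np.random.default_rng(seed)
    nin, nout = X.shape[1], t.shape[1]
    p = np.concatenate([0.5 * rng.standard_normal(nin * nh), [1.0],
                        0.5 * rng.standard_normal(nh * nout), [1.0]])
    losses, d, g_prev = [], None, None
    for _ in range(iters):
        loss, g = loss_and_grad(p, X, t, nh)
        losses.append(loss)
        if d is None:
            d = -g
        else:
            beta = (g @ g) / (g_prev @ g_prev)     # Fletcher-Reeves coefficient
            d = -g + beta * d
            if d @ g >= 0:                         # restart when not a descent direction
                d = -g
        g_prev = g
        step = 1.0                                 # Armijo backtracking line search
        while loss_and_grad(p + step * d, X, t, nh)[0] > loss + 1e-4 * step * (g @ d):
            step *= 0.5
            if step < 1e-10:
                break
        p = p + step * d
    return losses

# Toy usage on XOR: the loss should fall steadily from its initial value.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
t = np.array([[0], [1], [1], [0]], dtype=float)
losses = train_cg(X, t)
```

Because the new search direction mixes the gradient with respect to both weights and gains, the gain term effectively reshapes the initial descent direction at each restart, which is where the claimed efficiency gain comes from.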

Keywords:
Conjugate gradient method, Artificial neural network, Backpropagation, Computer science, Conjugate, Artificial intelligence, Algorithm, Mathematics

Metrics

Cited By: 53
FWCI (Field-Weighted Citation Impact): 13.41
References: 0
Citation Normalized Percentile: 0.99 (in top 1%)

Topics

Advanced Algorithms and Applications (Physical Sciences → Engineering → Control and Systems Engineering)
Advanced Sensor and Control Systems (Physical Sciences → Engineering → Control and Systems Engineering)