Ahmad Alhawarat, T. Nguyen‐Thoi, Ramadan Sabra, Zabidin Salleh
To solve unconstrained optimization problems, conjugate gradient (CG) methods are commonly used because, unlike Newton's method or the Broyden–Fletcher–Goldfarb–Shanno (BFGS) method, they do not require the storage of second-derivative information. Recently, a modification of the Polak–Ribière method with a new restart condition, called the AZPRP method, was proposed. In this paper, we propose a new modification of the AZPRP CG method for large-scale unconstrained optimization problems, based on a modified restart condition. The new parameter satisfies the descent property, and global convergence is established under the strong Wolfe–Powell line search. The numerical results show that the new CG method is highly competitive with the CG_Descent method. The comparisons, which cover the number of iterations and CPU time, are made on a set of more than 140 standard test functions from the CUTEst library.
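The abstract describes a restart-based modification of a PRP-type CG method. The exact AZPRP parameter is not given here, so the following is only an illustrative sketch of a classical nonlinear CG iteration with the PRP+ restart rule (beta truncated at zero) and a simple Armijo backtracking line search in place of the strong Wolfe–Powell search assumed in the convergence analysis; the function `prp_plus_cg` and all its parameters are hypothetical names, not the authors' implementation.

```python
import numpy as np

def prp_plus_cg(f, grad, x0, tol=1e-6, max_iter=1000):
    """Nonlinear CG with the PRP+ beta (PRP truncated at zero).

    Sketch only: the AZPRP method in the paper uses a different
    restart condition, and its analysis assumes the strong
    Wolfe-Powell line search rather than Armijo backtracking.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial search direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # simple Armijo backtracking line search
        alpha, c1 = 1.0, 1e-4
        fx = f(x)
        while f(x + alpha * d) > fx + c1 * alpha * g.dot(d):
            alpha *= 0.5
            if alpha < 1e-12:
                break
        x_new = x + alpha * d
        g_new = grad(x_new)
        # PRP beta with non-negativity restart (PRP+)
        beta = max(g_new.dot(g_new - g) / g.dot(g), 0.0)
        d = -g_new + beta * d
        # fall back to steepest descent if d is not a descent direction
        if g_new.dot(d) >= 0:
            d = -g_new
        x, g = x_new, g_new
    return x

# Usage: minimize a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer is the solution of A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
quad = lambda x: 0.5 * x.dot(A).dot(x) - b.dot(x)
quad_grad = lambda x: A.dot(x) - b
x_star = prp_plus_cg(quad, quad_grad, np.zeros(2))
```

The truncation `max(..., 0.0)` is what distinguishes PRP+ from plain PRP: it discards negative beta values, which acts as an automatic restart and is one ingredient behind the global-convergence results for PRP-type methods.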