A Modification of Conjugate Gradient Parameter and Its Global Convergence for Solving Unconstrained Optimization Problems
DOI:
https://doi.org/10.32890/jcia2025.4.2.3

Abstract
Nonlinear conjugate gradient coefficient methods are significant and helpful in solving large-scale unconstrained optimization problems, owing to their simplicity and low storage requirements. Research on their application to higher-dimensional systems of nonlinear equations is therefore of paramount importance. Many authors have studied and developed different kinds of conjugate gradient coefficients. Recently, some conjugate gradient methods were proposed for large-dimensional problems, but they were tested only on small sample sizes, so they cannot solve problems of higher dimension. This research therefore proposes a modified algorithm suited to problems of very high dimension with a large sample size. The strong Wolfe line search strategy was applied in the convergence analysis, which establishes global convergence together with the descent property. Finally, the numerical results show that the proposed algorithm performs more efficiently than, and is superior to, existing Conjugate Gradient (CG) coefficients.
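The abstract does not reproduce the paper's modified CG coefficient, so as an illustrative sketch only, the following Python implements the generic nonlinear CG iteration it describes, using the classical Fletcher–Reeves coefficient as a stand-in beta and a bisection-style strong Wolfe line search. The parameter values (c1, c2, tolerances) are common textbook defaults, not the authors' choices.

```python
import numpy as np

def strong_wolfe(f, grad, x, d, c1=1e-4, c2=0.4, max_iter=50):
    """Bisection-style line search returning a step length satisfying the
    strong Wolfe conditions (sufficient decrease + curvature)."""
    phi0, dphi0 = f(x), grad(x) @ d   # dphi0 < 0 along a descent direction
    lo, hi, alpha = 0.0, None, 1.0
    for _ in range(max_iter):
        if f(x + alpha * d) > phi0 + c1 * alpha * dphi0:
            hi = alpha                 # Armijo condition fails: step too long
        else:
            dphi = grad(x + alpha * d) @ d
            if abs(dphi) <= -c2 * dphi0:
                return alpha           # both strong Wolfe conditions hold
            if dphi >= 0:
                hi = alpha             # overshot the one-dimensional minimizer
            else:
                lo = alpha             # still descending: step too short
        alpha = 2 * alpha if hi is None else 0.5 * (lo + hi)
    return alpha

def cg_fletcher_reeves(f, grad, x0, tol=1e-6, max_iter=200):
    """Nonlinear conjugate gradient with the Fletcher-Reeves beta
    (an illustrative choice; the paper proposes a modified coefficient)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = strong_wolfe(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)   # Fletcher-Reeves coefficient
        d = -g_new + beta * d              # new conjugate search direction
        g = g_new
    return x

# Usage: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b (here x* = (0.2, 0.4)).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, 1.0])
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = cg_fletcher_reeves(f, grad, np.zeros(2))
```

With c2 < 0.5, the strong Wolfe conditions are known to preserve the descent property for Fletcher–Reeves directions, which is the role the line search plays in the convergence analysis summarized above.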