Now showing items 1-3 of 3
A modified scaled conjugate gradient method with global convergence for nonconvex functions
Following Andrei's approach, a modified scaled memoryless BFGS preconditioned conjugate gradient method is proposed based on the modified secant equation suggested by Li and Fukushima. It is shown that the method is globally convergent without...
Two modified three-term conjugate gradient methods with sufficient descent property
Theory Appl 153:733–757, 2012) in which the search directions are computed using the secant equations in a way to achieve the sufficient descent property. One of the methods is shown to be globally convergent for uniformly convex objective functions while...
Two new conjugate gradient methods based on modified secant equations
Following the approach proposed by Dai and Liao, we introduce two nonlinear conjugate gradient methods for unconstrained optimization problems. One of our proposed methods is based on a modified version of the secant equation proposed by Zhang, Deng...