Search
Showing 1–10 of 53
A hybridization of the Polak-Ribière-Polyak and Fletcher-Reeves conjugate gradient methods
, global convergence of the method is established without a convexity assumption on the objective function. The method is numerically compared with the three-term conjugate gradient method proposed by Zhang et al. and a modified version of the Polak...
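For reference, the two update parameters being hybridized (standard notation; g_k denotes the gradient at the k-th iterate — these are general background formulas, not a quotation from this abstract) are

    \beta_k^{FR} = \frac{\|g_k\|^2}{\|g_{k-1}\|^2}, \qquad
    \beta_k^{PRP} = \frac{g_k^T (g_k - g_{k-1})}{\|g_{k-1}\|^2}.

Hybrid schemes typically choose \beta_k between these two values, aiming to combine the convergence theory of Fletcher-Reeves with the practical restart behavior of Polak-Ribière-Polyak.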
A New Three-Term Conjugate Gradient Method with Descent Direction for Unconstrained Optimization
is closest to the direction of the Newton method or satisfies the conjugacy condition as the iterations evolve. In addition, under mild conditions, we prove global convergence properties of the proposed method. Numerical comparison illustrates...
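As background, three-term conjugate gradient methods generate search directions of the general form (standard notation; the specific scalars used in this paper are not shown in the snippet)

    d_k = -g_k + \beta_k d_{k-1} + \theta_k y_{k-1}, \qquad y_{k-1} = g_k - g_{k-1},

with \beta_k and \theta_k chosen, as the abstract indicates, so that d_k stays close to the Newton direction or satisfies a conjugacy condition.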
The Dai-Liao nonlinear conjugate gradient method with optimal parameter choices
, it is briefly shown that the methods are globally convergent when the line search fulfills the strong Wolfe conditions. Numerical comparisons between the implementations of the proposed methods and the conjugate gradient methods proposed by Hager and Zhang...
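For context, the Dai-Liao update parameter in standard notation is

    \beta_k^{DL} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}} - t\, \frac{g_k^T s_{k-1}}{d_{k-1}^T y_{k-1}}, \qquad t > 0,

where s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}; the "optimal parameter choices" of the title presumably refer to rules for selecting the scalar t.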
Two optimal Dai–Liao conjugate gradient methods
, and the other one is obtained by minimizing the Frobenius condition number of the search direction matrix. Global convergence analyses are briefly made. Numerical results are reported; they demonstrate the effectiveness of the suggested adaptive choices....
A modified scaled conjugate gradient method with global convergence for nonconvex functions
Following Andrei’s approach, a modified scaled memoryless BFGS preconditioned conjugate gradient method is proposed based on the modified secant equation suggested by Li and Fukushima. It is shown that the method is globally convergent without...
A descent family of Dai-Liao conjugate gradient methods
, and Dai and Kou, as special cases. It is shown that the methods of the suggested class are globally convergent for uniformly convex objective functions. Numerical results are reported; they demonstrate the efficiency of the proposed methods in the sense...
Two modified three-term conjugate gradient methods with sufficient descent property
Theory Appl 153:733–757, 2012) in which the search directions are computed using the secant equations in a way to achieve the sufficient descent property. One of the methods is shown to be globally convergent for uniformly convex objective functions while...
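For reference, the sufficient descent property mentioned here is standardly stated as

    g_k^T d_k \le -c\, \|g_k\|^2 \quad \text{for all } k,

for some constant c > 0 independent of k, which guarantees that every search direction is a descent direction.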
An Adaptive Hager-Zhang Conjugate Gradient Method
analysis, a modified version of the Hager-Zhang method is proposed, using an adaptive switch from the Hager-Zhang method to the Hestenes-Stiefel method when the mentioned condition number is large. A brief global convergence analysis is made...
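For context, in standard notation the Hestenes-Stiefel and Hager-Zhang parameters are

    \beta_k^{HS} = \frac{g_k^T y_{k-1}}{d_{k-1}^T y_{k-1}}, \qquad
    \beta_k^{HZ} = \beta_k^{HS} - 2\, \frac{\|y_{k-1}\|^2}{d_{k-1}^T y_{k-1}} \cdot \frac{g_k^T d_{k-1}}{d_{k-1}^T y_{k-1}},

so switching from Hager-Zhang to Hestenes-Stiefel amounts to dropping the second, curvature-scaled term; the switching test itself is specific to this paper and not shown in the snippet.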
A descent hybrid modification of the Polak–Ribière–Polyak conjugate gradient method
the sufficient descent condition independent of the line search and the objective function convexity. Similar to the Polak–Ribière–Polyak method, the method possesses an automatic restart feature which avoids jamming. Global convergence analyses are conducted...
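A one-line sketch of the restart mechanism alluded to (standard Polak-Ribière-Polyak reasoning, not specific to this paper): when the iterates jam, successive gradients become nearly equal, g_k \approx g_{k-1}, so

    \beta_k^{PRP} = \frac{g_k^T (g_k - g_{k-1})}{\|g_{k-1}\|^2} \approx 0,

and the direction d_k = -g_k + \beta_k d_{k-1} collapses to roughly -g_k, i.e. the method automatically restarts along the steepest descent direction.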