Show simple item record

contributor author: Saman Babaie-Kafaki
contributor author: Reza Ghanbari
date accessioned: 2020-06-06T13:20:06Z
date available: 2020-06-06T13:20:06Z
date issued: 2014
identifier uri: https://libsearch.um.ac.ir:443/fum/handle/fum/3350650?show=full
description abstract: Following Andrei's approach, a modified scaled memoryless BFGS preconditioned conjugate gradient method is proposed, based on the modified secant equation suggested by Li and Fukushima. The method is shown to be globally convergent without a convexity assumption on the objective function. Furthermore, for uniformly convex objective functions, the sufficient descent property of the method is established through an eigenvalue analysis. Numerical experiments demonstrate the efficiency of the method.
language: English
title: A modified scaled conjugate gradient method with global convergence for nonconvex functions
type: Journal Paper
content type: External Fulltext
subject keywords: Unconstrained optimization
subject keywords: Conjugate gradient algorithm
subject keywords: Secant equation
subject keywords: Descent condition
subject keywords: Global convergence
journal title: Bulletin of the Belgian Mathematical Society - Simon Stevin
pages: 465-477
journal volume: 21
journal issue: 3
identifier link: https://profdoc.um.ac.ir/paper-abstract-1043071.html
identifier articleid: 1043071
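The abstract describes a scaled memoryless BFGS preconditioned conjugate gradient method. As a rough illustration of how such an iteration is organized, the sketch below implements a generic Andrei-style scaled memoryless BFGS direction with Oren-Luenberger scaling and a simple Armijo backtracking line search; the scaling choice, line-search rule, and safeguards are assumptions for illustration, not the paper's exact modified method based on the Li-Fukushima secant equation.

```python
import numpy as np

def scaled_memoryless_bfgs_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Sketch of a scaled memoryless BFGS preconditioned CG iteration.

    Generic illustration only: uses Oren-Luenberger scaling and an
    Armijo backtracking line search, not the paper's modified scheme.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # initial steepest-descent direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (step-size rule assumed here).
        alpha = 1.0
        while f(x + alpha * d) > f(x) + 1e-4 * alpha * (g @ d) and alpha > 1e-12:
            alpha *= 0.5
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        sy = s @ y
        if sy > 1e-12:
            theta = sy / (y @ y)  # Oren-Luenberger scaling parameter
            # Scaled memoryless BFGS preconditioned CG direction
            d = (-theta * g_new
                 + theta * ((g_new @ s) / sy) * y
                 - ((1.0 + theta * (y @ y) / sy) * (g_new @ s) / sy
                    - theta * (g_new @ y) / sy) * s)
        else:
            d = -g_new  # restart when curvature information is lost
        if g_new @ d >= 0:
            d = -g_new  # safeguard: enforce a descent direction
        x, g = x_new, g_new
    return x

# Usage on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x
A = np.diag([1.0, 2.0, 3.0, 4.0, 5.0])
b = np.ones(5)
f = lambda x: 0.5 * x @ A @ x - b @ x
grad = lambda x: A @ x - b
x_star = scaled_memoryless_bfgs_cg(f, grad, np.zeros(5))
```

On this quadratic the iteration drives the gradient norm below the tolerance well within the iteration budget; the restart and descent safeguards keep the sketch robust when the curvature condition fails.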


Files in this item: there are no files associated with this item.
