Ferdowsi University of Mashhad | Information Center and Central Library
A modified scaled conjugate gradient method with global convergence for nonconvex functions

Author(s): Saman Babaie-Kafaki, Reza Ghanbari
Year: 2014
Abstract: Following Andrei’s approach, a modified scaled memoryless BFGS preconditioned conjugate gradient method is proposed based on the modified secant equation suggested by Li and Fukushima. The method is shown to be globally convergent without any convexity assumption on the objective function. Furthermore, for uniformly convex objective functions, the sufficient descent property of the method is established through an eigenvalue analysis. Numerical experiments demonstrate the efficiency of the method.
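The class of methods the abstract describes can be sketched as follows. This is an illustrative sketch only, not the authors' exact algorithm: it implements a generic scaled memoryless BFGS preconditioned direction (the BFGS update of a scaled identity θI), with a spectral scaling θ = sᵀy/yᵀy, an Armijo backtracking line search, and a steepest-descent safeguard. The paper's actual contribution, the modification via Li and Fukushima's secant equation, is omitted; the test function (Rosenbrock) and all parameter values are assumptions for demonstration.

```python
import numpy as np

def rosenbrock(x):
    """Classic nonconvex test function with minimizer at (1, 1)."""
    return 100.0 * (x[1] - x[0] ** 2) ** 2 + (1.0 - x[0]) ** 2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0] ** 2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0] ** 2),
    ])

def memoryless_bfgs_direction(g, s, y, theta):
    """Direction d = -H g, where H is one BFGS update of the scaled
    identity theta*I using the most recent step pair (s, y)."""
    sy = s @ y
    if sy <= 1e-12:
        return -theta * g  # safeguard: fall back to scaled steepest descent
    H = (theta * np.eye(len(g))
         - theta * (np.outer(s, y) + np.outer(y, s)) / sy
         + (1.0 + theta * (y @ y) / sy) * np.outer(s, s) / sy)
    return -H @ g

def scaled_memoryless_bfgs_cg(f, grad, x0, max_iter=2000, tol=1e-6):
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g  # first iteration: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (a simplification; convergence
        # analyses of such methods typically assume a Wolfe search).
        t, fx, gd = 1.0, f(x), g @ d
        while f(x + t * d) > fx + 1e-4 * t * gd and t > 1e-12:
            t *= 0.5
        s = t * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Spectral (Oren-Luenberger-type) scaling; the paper derives its
        # own scaling, so this choice is an assumption.
        sy = s @ y
        theta = sy / (y @ y) if sy > 0 else 1.0
        d = memoryless_bfgs_direction(g_new, s, y, theta)
        if g_new @ d >= 0.0:
            d = -g_new  # enforce a descent direction
        x, g = x_new, g_new
    return x
```

Starting from the standard point (-1.2, 1.0), the iterates drive the gradient norm toward zero; the memoryless update stores only the last step pair, so the cost per iteration stays close to that of a conjugate gradient step.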
URI: http://libsearch.um.ac.ir:80/fum/handle/fum/3350650
Keyword(s): Unconstrained optimization, Conjugate gradient algorithm, Secant equation, Descent condition, Global convergence
Collections:
  • ProfDoc

Full item record:
  • Contributor author: Saman Babaie-Kafaki
  • Contributor author: Reza Ghanbari
  • Date accessioned: 2020-06-06T13:20:06Z
  • Date available: 2020-06-06T13:20:06Z
  • Date issued: 2014
  • Identifier URI: http://libsearch.um.ac.ir:80/fum/handle/fum/3350650
  • Language: English
  • Title: A modified scaled conjugate gradient method with global convergence for nonconvex functions
  • Type: Journal Paper
  • Content type: External Fulltext
  • Subject keywords: Unconstrained optimization; Conjugate gradient algorithm; Secant equation; Descent condition; Global convergence
  • Journal title: Bulletin of the Belgian Mathematical Society-Simon Stevin
  • Pages: 465-477
  • Journal volume: 21
  • Journal issue: 3
  • Identifier link: https://profdoc.um.ac.ir/paper-abstract-1043071.html
  • Article ID: 1043071
DSpace digital library software localized into Persian by Yabesh for Iranian libraries | Contact Yabesh
DSpace software copyright © 2019-2022  DuraSpace