Ferdowsi University of Mashhad | Information Center and Central Library
Digital Library of Ferdowsi University of Mashhad
A modified scaled conjugate gradient method with global convergence for nonconvex functions

Authors: Saman Babaie-Kafaki, Reza Ghanbari
Year: 2014
Abstract: Following Andrei’s approach, a modified scaled memoryless BFGS preconditioned conjugate gradient method is proposed based on the modified secant equation suggested by Li and Fukushima. The method is shown to be globally convergent without any convexity assumption on the objective function. Furthermore, for uniformly convex objective functions, a sufficient descent property of the method is established through an eigenvalue analysis. Numerical experiments demonstrate the efficiency of the method.
URI: http://libsearch.um.ac.ir:80/fum/handle/fum/3350650
Keywords: Unconstrained optimization, Conjugate gradient algorithm, Secant equation, Descent condition, Global convergence
Collection: ProfDoc

Full item record:

contributor.author: Saman Babaie-Kafaki
contributor.author: Reza Ghanbari (رضا قنبری)
date.accessioned: 2020-06-06T13:20:06Z
date.available: 2020-06-06T13:20:06Z
date.issued: 2014
identifier.uri: http://libsearch.um.ac.ir:80/fum/handle/fum/3350650
language: English
title: A modified scaled conjugate gradient method with global convergence for nonconvex functions
type: Journal Paper
content type: External Fulltext
subject.keywords: Unconstrained optimization; Conjugate gradient algorithm; Secant equation; Descent condition; Global convergence
journal.title: Bulletin of the Belgian Mathematical Society-Simon Stevin
pages: 465-477
journal.volume: 21
journal.issue: 3
identifier.link: https://profdoc.um.ac.ir/paper-abstract-1043071.html
identifier.articleid: 1043071
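The abstract describes a scaled memoryless BFGS preconditioned conjugate gradient method with a modified secant equation. As a rough illustration only, not the paper's algorithm, the sketch below implements a generic memoryless-BFGS preconditioned CG direction with Oren–Luenberger scaling; the Armijo backtracking line search, the curvature safeguard standing in for the Li–Fukushima modified secant equation, all constants, and the function names are assumptions.

```python
import numpy as np

def rosenbrock(x):
    # Classic nonconvex test function with minimizer (1, 1).
    return 100.0 * (x[1] - x[0]**2)**2 + (1.0 - x[0])**2

def rosenbrock_grad(x):
    return np.array([
        -400.0 * x[0] * (x[1] - x[0]**2) - 2.0 * (1.0 - x[0]),
        200.0 * (x[1] - x[0]**2),
    ])

def scaled_mlbfgs_cg(f, grad, x0, tol=1e-6, max_iter=20000):
    """Memoryless-BFGS preconditioned CG (illustrative sketch only)."""
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Backtracking Armijo line search (assumed; not the paper's line search).
        alpha, c1 = 1.0, 1e-4
        while f(x + alpha * d) > f(x) + c1 * alpha * (g @ d):
            alpha *= 0.5
            if alpha < 1e-16:
                break
        s = alpha * d
        x_new = x + s
        g_new = grad(x_new)
        y = g_new - g
        # Curvature safeguard: forces s'y > 0, a crude stand-in for the
        # modified secant equation used in the paper.
        sy, ss = s @ y, s @ s
        if sy <= 1e-10 * ss:
            y = y + (1e-10 + max(0.0, -sy / ss)) * s
            sy = s @ y
        theta = sy / (y @ y)  # Oren-Luenberger scaling of the initial Hessian
        # d = -H g_new, where H is the BFGS update of theta*I using (s, y):
        # H = theta*I - theta*(s y' + y s')/s'y + (1 + theta*y'y/s'y) s s'/s'y
        Hg = (theta * g_new
              - theta * ((s @ g_new) * y + (y @ g_new) * s) / sy
              + (1.0 + theta * (y @ y) / sy) * (s @ g_new) * s / sy)
        d = -Hg
        # Restart with steepest descent if the direction loses descent.
        if g_new @ d > -1e-12 * np.linalg.norm(g_new) * np.linalg.norm(d):
            d = -g_new
        x, g = x_new, g_new
    return x, g

x_star, g_star = scaled_mlbfgs_cg(rosenbrock, rosenbrock_grad, [-1.2, 1.0])
```

Without the safeguard, s'y can be nonpositive on nonconvex functions, which is exactly the failure mode that modified secant equations such as Li and Fukushima's are designed to avoid.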
"DSpace" digital library software, localized into Persian by Yabesh for Iranian libraries | Contact Yabesh
DSpace software copyright © 2019-2022 DuraSpace