A descent family of hybrid conjugate gradient methods with global convergence property for nonconvex functions
Journal of Mathematical Modeling
Volume 10, Issue 3, December 2022, Pages 487-498 | Full text (235.24 K)
Article type: Research Article
DOI: 10.22124/jmm.2022.21772.1910
Author
Mina Lotfi*
Department of Applied Mathematics, Tarbiat Modares University, P.O. Box 14115-175, Tehran, Iran
Abstract
In this paper, we present a new hybrid conjugate gradient method for unconstrained optimization that possesses the sufficient descent property independently of any line search. In our method, a convex combination of the Hestenes-Stiefel (HS) and Fletcher-Reeves (FR) methods is used as the conjugate parameter, and the hybridization parameter is determined by minimizing the distance between the hybrid conjugate gradient direction and the direction of the three-term HS method proposed by M. Li (A family of three-term nonlinear conjugate gradient methods close to the memoryless BFGS method, Optim. Lett. 12 (8) (2018) 1911-1927). Under some standard assumptions, the global convergence property on general functions is established. Numerical results on some test problems from the CUTEst library illustrate the efficiency and robustness of the proposed method in practice.
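For orientation, the following displays sketch the standard HS and FR conjugate parameters and the generic convex-combination form used by hybrid methods of this kind; the paper's specific least-squares rule for choosing the hybridization parameter \(\theta_k\) (based on the distance to Li's three-term HS direction) is not reproduced here, so this is only a sketch under standard conventions.

\[
d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
\[
\beta_k^{HS} = \frac{g_{k+1}^{\top} y_k}{d_k^{\top} y_k}, \qquad
\beta_k^{FR} = \frac{\|g_{k+1}\|^{2}}{\|g_k\|^{2}}, \qquad
y_k = g_{k+1} - g_k,
\]
\[
\beta_k = (1 - \theta_k)\, \beta_k^{HS} + \theta_k\, \beta_k^{FR}, \qquad \theta_k \in [0, 1].
\]

Choosing \(\theta_k\) at each iteration so that the hybrid direction stays close to a three-term direction related to the memoryless BFGS update is what the abstract identifies as the source of the method's descent and convergence behavior.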
Keywords
Unconstrained optimization; conjugate gradient method; sufficient descent; least-squares; global convergence
Statistics: article views 499, full-text downloads 567