Abstract

In this paper, we present a new conjugate gradient method with an acceleration scheme for solving large-scale unconstrained optimization problems. The generated search direction satisfies both the sufficient descent condition and the Dai–Liao conjugacy condition independently of the line search. Moreover, the conjugate gradient parameter incorporates more useful information without additional computational cost or storage, which can improve numerical performance. Under suitable assumptions, global convergence of the proposed method with a Wolfe line search is established. Numerical experiments show that the method is competitive on unconstrained optimization problems with dimensions up to 100,000.
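To illustrate the general framework the abstract describes, the sketch below implements a generic nonlinear conjugate gradient iteration with a Wolfe line search (via `scipy.optimize.line_search`). Note that the update formula used here is the standard Hestenes–Stiefel parameter, chosen only as a placeholder; it is not the accelerated parameter proposed in the paper, and the fallback step size is an assumption for robustness.

```python
import numpy as np
from scipy.optimize import line_search  # strong Wolfe line search

def nonlinear_cg(f, grad, x0, tol=1e-6, max_iter=500):
    """Generic nonlinear CG sketch (Hestenes-Stiefel beta, Wolfe line search).

    This is an illustrative stand-in, not the paper's accelerated method.
    """
    x = x0.astype(float).copy()
    g = grad(x)
    d = -g  # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # Wolfe line search; returns None for alpha if it fails
        alpha = line_search(f, grad, x, d, gfk=g)[0]
        if alpha is None:
            alpha = 1e-4  # assumed fallback step (not from the paper)
        x_new = x + alpha * d
        g_new = grad(x_new)
        y = g_new - g  # gradient difference, as in Dai-Liao-type formulas
        dy = d @ y
        # Hestenes-Stiefel conjugacy parameter (placeholder choice)
        beta = (g_new @ y) / dy if abs(dy) > 1e-12 else 0.0
        d = -g_new + beta * d
        x, g = x_new, g_new
    return x
```

On a strictly convex quadratic, this iteration reduces to classical conjugate gradients and converges rapidly, which makes a small quadratic a convenient sanity check.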

Details

Title
A new accelerated conjugate gradient method for large-scale unconstrained optimization
Author
Chen, Yuting (1); Cao, Mingyuan (2); Yang, Yueting (2)

(1) College of Mathematics, Jilin University, Changchun, China; College of Mathematics and Statistics, Beihua University, Jilin, China
(2) College of Mathematics and Statistics, Beihua University, Jilin, China
Pages
1-13
Publication year
2019
Publication date
Nov 2019
Publisher
Springer Nature B.V.
ISSN
1025-5834
e-ISSN
1029-242X
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2316320976
Copyright
Journal of Inequalities and Applications is a copyright of Springer, © 2019. All rights reserved. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.