Mathematical Problems in Engineering
Volume 2009 (2009), Article ID 243290, 16 pages
doi:10.1155/2009/243290
Research Article

Nonlinear Conjugate Gradient Methods with Sufficient Descent Condition for Large-Scale Unconstrained Optimization

1Institute of Applied Mathematics, College of Mathematics and Information Science, Henan University, Kaifeng 475000, China
2Department of Mathematics, Nanjing University, Nanjing 210093, China
3College of Mathematics and Information Science, Guangxi University, Nanning 530004, China

Received 3 January 2009; Revised 21 February 2009; Accepted 1 May 2009

Academic Editor: Joaquim J. Júdice

Copyright © 2009 Jianguo Zhang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Abstract

Two nonlinear conjugate gradient-type methods for solving unconstrained optimization problems are proposed. An attractive property of the methods is that the generated directions are always descent directions, independent of any line search. Under some mild conditions, global convergence results for both methods are established. Preliminary numerical results show that the proposed methods are promising and competitive with the well-known PRP method.
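For context, and not as the authors' specific formulas, a generic nonlinear conjugate gradient iteration for minimizing a smooth function $f$ takes the standard form
\[
x_{k+1} = x_k + \alpha_k d_k, \qquad d_0 = -g_0, \qquad d_{k+1} = -g_{k+1} + \beta_k d_k,
\]
where $g_k = \nabla f(x_k)$, $\alpha_k > 0$ is a step size, and $\beta_k$ is a scalar parameter; the PRP choice is $\beta_k^{\mathrm{PRP}} = g_{k+1}^{T}(g_{k+1} - g_k)/\|g_k\|^{2}$. The sufficient descent condition referred to in the abstract is commonly stated as
\[
g_k^{T} d_k \le -c\,\|g_k\|^{2} \quad \text{for all } k,
\]
for some constant $c > 0$ independent of $k$; the authors' particular choice of $\beta_k$ and the associated constants are given in the body of the paper.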