J4 ›› 2009, Vol. 44 ›› Issue (7): 33-37.

• Articles •

A new class of memory gradient methods with Wolfe line search

TANG Jingyong, DONG Li, ZHANG Xiujun   

  1. College of Mathematics and Information Science, Xinyang Normal University, Xinyang 464000, Henan, China
  • Received: 2008-11-17  Online: 2009-07-16  Published: 2009-11-01

Abstract:

A new class of memory gradient methods for unconstrained optimization problems was presented, and its global convergence and linear convergence rate were proved under mild conditions. By using both the current and previous iterative information to generate the descent direction, the methods avoid the computation and storage of matrices, which makes them suitable for large-scale optimization problems. Numerical experiments show that, under the Wolfe line search, the new methods are more efficient than the FR, PRP, and HS conjugate gradient methods and the steepest descent method.

Key words: unconstrained optimization; memory gradient method; global convergence; linear convergence rate
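The kind of iteration the abstract describes, a search direction built from the current gradient and the previous direction, combined with a Wolfe line search, can be sketched as follows. The abstract does not give the paper's exact formulas, so the update rule d_k = -g_k + beta_k * d_{k-1} and the particular choice of beta_k below are illustrative assumptions chosen only to keep d_k a descent direction; they are not the authors' method.

```python
import numpy as np

def wolfe_line_search(f, grad, x, d, c1=1e-4, c2=0.9, max_iter=50):
    """Bisection/expansion line search for a step satisfying the Wolfe conditions."""
    alpha, lo, hi = 1.0, 0.0, np.inf
    fx, slope = f(x), grad(x) @ d          # slope < 0 for a descent direction
    for _ in range(max_iter):
        x_new = x + alpha * d
        if f(x_new) > fx + c1 * alpha * slope:   # Armijo condition fails: shrink
            hi = alpha
            alpha = 0.5 * (lo + hi)
        elif grad(x_new) @ d < c2 * slope:       # curvature condition fails: grow
            lo = alpha
            alpha = 2.0 * alpha if hi == np.inf else 0.5 * (lo + hi)
        else:
            return alpha                          # both Wolfe conditions hold
    return alpha

def memory_gradient(f, grad, x0, tol=1e-8, max_iter=500, rho=0.5):
    """Generic memory gradient iteration: d_k = -g_k + beta_k * d_{k-1}.

    beta_k is an illustrative choice (not the paper's) that is bounded by rho
    and guarantees g_k @ d_k <= -(1 - rho) * ||g_k||^2, i.e. descent.
    """
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = wolfe_line_search(f, grad, x, d)
        x = x + alpha * d
        g_new = grad(x)
        beta = rho * (g_new @ g_new) / max(abs(g_new @ d), g_new @ g_new)
        d = -g_new + beta * d                     # reuse previous direction (the "memory")
        g = g_new
    return x
```

Because no matrices are formed or stored, each iteration costs only a few vector operations and gradient evaluations, which is the property the abstract credits for the methods' suitability to large-scale problems.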

CLC Number: 

  • O2212