Result: Some remarks on conjugate gradient methods without line search

Title:
Some remarks on conjugate gradient methods without line search
Authors:
Source:
Applied Mathematics and Computation, 181(1): 370-379
Publisher Information:
New York, NY: Elsevier, 2006.
Publication Year:
2006
Physical Description:
print; 21 references
Original Material:
INIST-CNRS
Subject Terms:
Control theory, Operational research, Computer science, Mathematics, Exact sciences and technology, Sciences and techniques of general use, Mathematical analysis, Calculus of variations and optimal control, Applied sciences, Operational research. Management science, Operational research and scientific management, Optimization. Search problems, Convergence, Large scale, Partial differential equation, Applied mathematics, Conjugate gradient method, Method of lines, Unconstrained optimization, Initial value problem, Boundary value problem, Operations research, Global convergence, BFGS method, Minimizing method, Without line search, Conjugate gradient methods: Without line search, Fixed steplength, Limited memory BFGS method
Document Type:
Journal Article
File Description:
text
Language:
English
Author Affiliations:
Department of Mathematics, Zhejiang University, Hangzhou, China
ISSN:
0096-3003
Rights:
Copyright 2007 INIST-CNRS
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Notes:
Mathematics

Operational research. Management
Accession Number:
edscal.18364155
Database:
PASCAL Archive

Further information

The conjugate gradient method is widely used in unconstrained optimization, especially for large-scale problems. However, the line search in the conjugate gradient method is sometimes very difficult or prohibitively expensive. Sun and Zhang [J. Sun, J.P. Zhang, Global convergence of conjugate gradient methods without line search, Annals of Operations Research 103 (2001) 161-173] showed that by taking a fixed steplength αk defined by the formula αk = −δ (gkᵀdk)/(dkᵀQkdk), the conjugate gradient method is globally convergent for several popular choices of βk without line search. In the simplest case, all Qk could be identity matrices; however, this choice does not even guarantee the descent property. In this paper, we study some methods of selecting Qk that are based on the amount of descent and are superior to taking Qk ≡ I (the identity matrix).
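
For illustration, the following is a minimal sketch (Python/NumPy) of a conjugate gradient iteration that uses the fixed steplength above with the Fletcher-Reeves choice of βk and the simplest choice Qk = I. The function name, the test problem, and the value of δ are purely illustrative assumptions; δ must satisfy the smallness condition of Sun and Zhang (2001), and the paper's actual rules for selecting Qk from the amount of descent are not reproduced here.

import numpy as np

def cg_fixed_steplength(grad, x0, delta=0.2, max_iter=500, tol=1e-6):
    # Conjugate gradient iteration without line search, using the fixed
    # steplength alpha_k = -delta * (g_k^T d_k) / (d_k^T Q_k d_k) with Q_k = I
    # and the Fletcher-Reeves coefficient beta_k = ||g_{k+1}||^2 / ||g_k||^2.
    # Illustrative sketch only; delta is assumed small enough for convergence.
    x = np.asarray(x0, dtype=float)
    g = grad(x)
    d = -g                                    # initial direction: steepest descent
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        alpha = -delta * (g @ d) / (d @ d)    # fixed steplength, no line search
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)      # Fletcher-Reeves beta_k
        d = -g_new + beta * d                 # descent is NOT guaranteed when Q_k = I
        g = g_new
    return x

# Example: minimize the convex quadratic f(x) = 0.5 x^T A x - b^T x.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
print(cg_fixed_steplength(lambda x: A @ x - b, x0=np.zeros(2)))  # approaches [0.6, -0.8]

With Qk = I the directions dk need not be descent directions, which is precisely the shortcoming the paper addresses by choosing Qk according to the amount of descent.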