Some remarks on conjugate gradient methods without line search
CC BY 4.0
Unless otherwise stated above, the content of this bibliographic record may be used under a CC BY 4.0 licence by Inist-CNRS.
Operational research. Management
The conjugate gradient method is widely used in unconstrained optimization, especially for large-scale problems. However, the line search in the conjugate gradient method is sometimes very difficult or prohibitively expensive. Sun and Zhang [J. Sun, J.P. Zhang, Global convergence of conjugate gradient methods without line search, Annals of Operations Research 103 (2001) 161-173] showed that by taking a fixed steplength αk defined by the formula αk = −δ gk^T dk / (dk^T Qk dk), the conjugate gradient method is globally convergent for several popular choices of βk without line search. In the simplest case all Qk could be identity matrices; however, this choice does not even guarantee the descent property. In this paper, we study some methods for selecting Qk that are based on the amount of descent and are superior to taking Qk ≡ I (the identity matrix).
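To make the fixed-steplength idea concrete, here is a minimal sketch of a conjugate gradient iteration that uses the Sun-Zhang steplength αk = −δ gk^T dk / (dk^T Qk dk) instead of a line search. This is an illustrative implementation, not the authors' code: the Fletcher-Reeves choice of βk, the constant matrix Q = I, and the value of δ are assumptions made for the example; the paper under discussion studies smarter choices of Qk.

```python
import numpy as np

def cg_fixed_step(grad, x0, delta=0.2, Q=None, tol=1e-8, max_iter=5000):
    """Conjugate gradient with the fixed Sun-Zhang steplength (no line search).

    grad : callable returning the gradient at a point
    delta: fixed scalar 0 < delta (must be small enough for convergence)
    Q    : symmetric positive definite matrix (identity if None); the
           simplest choice Q = I does not even guarantee descent in general
    """
    x = np.asarray(x0, dtype=float)
    Q = np.eye(x.size) if Q is None else Q
    g = grad(x)
    d = -g                                  # initial search direction
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        # fixed steplength: alpha_k = -delta * g_k^T d_k / (d_k^T Q d_k)
        alpha = -delta * (g @ d) / (d @ Q @ d)
        x = x + alpha * d
        g_new = grad(x)
        beta = (g_new @ g_new) / (g @ g)    # Fletcher-Reeves beta_k (assumed)
        d = -g_new + beta * d
        g = g_new
    return x
```

On a strongly convex quadratic f(x) = 0.5 x^T A x − b^T x (gradient A x − b), the iteration drives the gradient norm toward zero for a sufficiently small δ, though typically more slowly than exact line search would.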