Discovery - University of Dundee - Online Publications

On the asymptotic behaviour of some new gradient methods

Standard

On the asymptotic behaviour of some new gradient methods. / Fletcher, Roger; Dai, Yu-Hong.

In: Mathematical Programming, Vol. 103, No. 3, 2005, p. 541-559.

Research output: Contribution to journal › Article

Harvard

Fletcher, R & Dai, Y-H 2005, 'On the asymptotic behaviour of some new gradient methods', Mathematical Programming, vol. 103, no. 3, pp. 541-559. doi: 10.1007/s10107-004-0516-9

APA

Fletcher, R., & Dai, Y-H. (2005). On the asymptotic behaviour of some new gradient methods. Mathematical Programming, 103(3), 541-559. https://doi.org/10.1007/s10107-004-0516-9

Vancouver

Fletcher R, Dai Y-H. On the asymptotic behaviour of some new gradient methods. Mathematical Programming. 2005;103(3):541-559. Available from: https://doi.org/10.1007/s10107-004-0516-9

Author

Fletcher, Roger; Dai, Yu-Hong / On the asymptotic behaviour of some new gradient methods.

In: Mathematical Programming, Vol. 103, No. 3, 2005, p. 541-559.

Research output: Contribution to journal › Article

BibTeX

@article{9b11dbce77bc468cb2de8d7c4081cdf1,
  title     = "On the asymptotic behaviour of some new gradient methods",
  author    = "Roger Fletcher and Yu-Hong Dai",
  journal   = "Mathematical Programming",
  publisher = "Springer",
  year      = "2005",
  volume    = "103",
  number    = "3",
  pages     = "541--559",
  doi       = "10.1007/s10107-004-0516-9",
}

RIS (suitable for import to EndNote)

TY - JOUR

T1 - On the asymptotic behaviour of some new gradient methods

A1 - Fletcher,Roger

A1 - Dai,Yu-Hong

AU - Fletcher,Roger

AU - Dai,Yu-Hong

PY - 2005

Y1 - 2005

N2 - The Barzilai-Borwein (BB) gradient method, and some other new gradient methods have shown themselves to be competitive with conjugate gradient methods for solving large dimension nonlinear unconstrained optimization problems. Little is known about the asymptotic behaviour of these methods, even when applied to n-dimensional quadratic functions, except in the case that n=2. We show in the quadratic case how it is possible to compute this asymptotic behaviour, and observe that as n increases there is a transition from superlinear to linear convergence at some value of n ≥ 4, depending on the method. By neglecting certain terms in the recurrence relations we define simplified versions of the methods, which are able to predict this transition. The simplified methods also predict that for larger values of n, the eigencomponents of the gradient vectors converge in modulus to a common value, which is similar to a property observed to hold in the real methods. Some unusual and interesting recurrence relations are analysed in the course of the study.

AB - The Barzilai-Borwein (BB) gradient method, and some other new gradient methods have shown themselves to be competitive with conjugate gradient methods for solving large dimension nonlinear unconstrained optimization problems. Little is known about the asymptotic behaviour of these methods, even when applied to n-dimensional quadratic functions, except in the case that n=2. We show in the quadratic case how it is possible to compute this asymptotic behaviour, and observe that as n increases there is a transition from superlinear to linear convergence at some value of n ≥ 4, depending on the method. By neglecting certain terms in the recurrence relations we define simplified versions of the methods, which are able to predict this transition. The simplified methods also predict that for larger values of n, the eigencomponents of the gradient vectors converge in modulus to a common value, which is similar to a property observed to hold in the real methods. Some unusual and interesting recurrence relations are analysed in the course of the study.

U2 - 10.1007/s10107-004-0516-9

DO - 10.1007/s10107-004-0516-9

M1 - Article

JO - Mathematical Programming

JF - Mathematical Programming

IS - 3

VL - 103

SP - 541

EP - 559

ER -
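
To make the abstract concrete, the lines below give a minimal Python sketch (not the authors' code) of the Barzilai-Borwein gradient method applied to a convex quadratic f(x) = 0.5 x^T A x, whose gradient is g = A x. The diagonal matrix A, the starting point, and the iteration limit are arbitrary illustrative choices; the BB1 stepsize formula s^T s / s^T y is the standard one from Barzilai & Borwein (1988). Because A is diagonal, the printed moduli |g_i| are the eigencomponents of the gradient whose asymptotic behaviour the paper analyses.

import numpy as np

# Minimal sketch: Barzilai-Borwein (BB1) gradient method on f(x) = 0.5 x^T A x.
# A's diagonal entries play the role of the eigenvalues of the quadratic.
A = np.diag([1.0, 2.0, 4.0, 8.0])   # illustrative eigenvalues
x = np.ones(4)                      # illustrative starting point
g = A @ x                           # gradient of the quadratic at x
alpha = 1.0                         # first iteration: plain gradient step

for k in range(60):
    x_new = x - alpha * g           # gradient step with current BB stepsize
    g_new = A @ x_new
    if np.linalg.norm(g_new) < 1e-12:
        break                       # gradient numerically zero: stop
    s, y = x_new - x, g_new - g
    alpha = (s @ s) / (s @ y)       # BB1 stepsize for the next iteration
    x, g = x_new, g_new
    # |g_i| are the eigencomponent moduli studied in the paper
    print(k, np.abs(g))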
