On the asymptotic behaviour of some new gradient methods

Roger Fletcher, Yu-Hong Dai

    Research output: Contribution to journal › Article › peer-review


    Abstract

    The Barzilai-Borwein (BB) gradient method, and some other new gradient methods, have shown themselves to be competitive with conjugate gradient methods for solving large-dimensional nonlinear unconstrained optimization problems. Little is known about the asymptotic behaviour of these methods, even when they are applied to n-dimensional quadratic functions, except in the case that n = 2. We show in the quadratic case how it is possible to compute this asymptotic behaviour, and observe that as n increases there is a transition from superlinear to linear convergence at some value of n ≥ 4, depending on the method. By neglecting certain terms in the recurrence relations we define simplified versions of the methods, which are able to predict this transition. The simplified methods also predict that for larger values of n, the eigencomponents of the gradient vectors converge in modulus to a common value, which is similar to a property observed to hold in the real methods. Some unusual and interesting recurrence relations are analysed in the course of the study.
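    For readers unfamiliar with the method the abstract refers to, the following is a minimal sketch of the BB gradient method applied to an n-dimensional quadratic, the setting studied in the paper. The quadratic test problem, the starting point, and the choice of initial step length are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=200, tol=1e-10):
    """Barzilai-Borwein (BB1) gradient method for the quadratic
    f(x) = 0.5 x^T A x - b^T x, whose gradient is g(x) = A x - b.

    The BB1 step length is alpha_k = (s^T s) / (s^T y), where
    s = x_k - x_{k-1} and y = g_k - g_{k-1}.
    """
    x = x0.astype(float)
    g = A @ x - b
    alpha = 1.0 / np.linalg.norm(g)  # first step length: an arbitrary choice
    for _ in range(max_iter):
        if np.linalg.norm(g) < tol:
            break
        x_new = x - alpha * g            # gradient step with BB step length
        g_new = A @ x_new - b
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)        # BB1 step length (s^T y > 0 for SPD A)
        x, g = x_new, g_new
    return x

# Example: a 3-dimensional quadratic with distinct eigenvalues
A = np.diag([1.0, 2.0, 10.0])
b = np.array([1.0, 1.0, 1.0])
x = bb_gradient(A, b, np.zeros(3))
```

    Convergence of the iterates is nonmonotone in the objective value, which is one reason the asymptotic behaviour analysed in the paper is nontrivial even for quadratics.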
    Original language: English
    Pages (from-to): 541-559
    Number of pages: 19
    Journal: Mathematical Programming
    Volume: 103
    Issue number: 3
    DOIs
    Publication status: Published - 2005
