Abstract
The Barzilai-Borwein (BB) gradient method, and some other new gradient methods, have shown themselves to be competitive with conjugate gradient methods for solving large-dimensional nonlinear unconstrained optimization problems. Little is known about their asymptotic behaviour, even when applied to n-dimensional quadratic functions, except in the case that n=2. We show in the quadratic case how it is possible to compute this asymptotic behaviour, and observe that as n increases there is a transition from superlinear to linear convergence at some value of n ≥ 4, depending on the method. By neglecting certain terms in the recurrence relations we define simplified versions of the methods, which are able to predict this transition. The simplified methods also predict that for larger values of n, the eigencomponents of the gradient vectors converge in modulus to a common value, which is similar to a property observed to hold in the real methods. Some unusual and interesting recurrence relations are analysed in the course of the study.
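For readers unfamiliar with the method discussed above, the standard BB1 steplength is α_k = s_{k-1}ᵀs_{k-1} / s_{k-1}ᵀy_{k-1}, where s_{k-1} = x_k - x_{k-1} and y_{k-1} = g_k - g_{k-1}. The following is a minimal sketch, in Python, of this BB1 variant applied to a convex quadratic; the function name, the diagonal test matrix, the initial steplength, and the stopping tolerance are illustrative assumptions and not taken from the paper itself.

```python
import numpy as np

def bb_gradient(A, b, x0, max_iter=200, tol=1e-10):
    """Sketch of the BB1 gradient method on f(x) = 0.5*x^T A x - b^T x."""
    x = x0.astype(float)
    g = A @ x - b              # gradient of the quadratic
    alpha = 1.0                # initial steplength (an assumption)
    for k in range(max_iter):
        x_new = x - alpha * g
        g_new = A @ x_new - b
        if np.linalg.norm(g_new) < tol:
            return x_new, k + 1
        s, y = x_new - x, g_new - g
        alpha = (s @ s) / (s @ y)   # BB1 steplength; s^T y > 0 for SPD A
        x, g = x_new, g_new
    return x, max_iter

# Example: diagonal quadratic with eigenvalues 1..n. The gradient norm of
# BB-type methods is typically nonmonotone along the iterations.
n = 6
A = np.diag(np.arange(1.0, n + 1.0))
b = np.ones(n)
x_star, iters = bb_gradient(A, b, np.zeros(n))
print(iters, np.linalg.norm(A @ x_star - b))
```

The diagonal quadratic is chosen only because the eigencomponents of the gradient, whose asymptotic behaviour the paper analyses, can then be read off directly from the coordinates of g.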
| Original language | English |
| --- | --- |
| Pages (from-to) | 541-559 |
| Number of pages | 19 |
| Journal | Mathematical Programming |
| Volume | 103 |
| Issue number | 3 |
| DOIs | |
| Publication status | Published - 2005 |