unconstrained minimization; limited-memory variable metric methods; repeated Byrd-Nocedal-Schnabel update; Lyapunov matrix equation; conjugate directions; global convergence; numerical results
To improve the performance of the L-BFGS method for large-scale unconstrained optimization, repeating some of the BFGS updates has been proposed. Since this can be time consuming, the extra updates need to be selected carefully. We show that under certain conditions groups of these updates can be repeated infinitely many times without a noticeable increase in computational time; the limit update is a block BFGS update. It can be obtained by solving a Lyapunov matrix equation, whose order can be decreased by applying vector corrections for conjugacy. Global convergence of the proposed algorithm is established for convex and sufficiently smooth functions. Numerical results indicate the efficiency of the new method.
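As background for the Lyapunov matrix equation mentioned above, a minimal sketch of solving the generic equation A X + X A^T = Q by Kronecker-product vectorization is given below. This is an illustration of the equation type only, not the paper's specific update formula; the function name and the choice of test matrices are assumptions for the example.

```python
import numpy as np

def solve_lyapunov(A, Q):
    """Solve A X + X A^T = Q via vectorization (illustrative, O(n^6) cost).

    Uses the column-stacking identity vec(A X B) = (B^T kron A) vec(X),
    so vec(A X + X A^T) = (I kron A + A kron I) vec(X) = vec(Q).
    """
    n = A.shape[0]
    I = np.eye(n)
    K = np.kron(I, A) + np.kron(A, I)
    # order="F" gives column-stacking (the vec convention used above).
    x = np.linalg.solve(K, Q.reshape(-1, order="F"))
    return x.reshape((n, n), order="F")

# Small stable example: A with negative-definite symmetric part guarantees
# a unique solution of the continuous Lyapunov equation.
rng = np.random.default_rng(0)
n = 4
A = -np.eye(n) + 0.2 * rng.standard_normal((n, n))
Q = rng.standard_normal((n, n))
Q = Q + Q.T  # symmetric right-hand side
X = solve_lyapunov(A, Q)
```

For the small orders arising after the vector corrections for conjugacy, a dense solve of this kind is cheap; production code would instead use a Bartels-Stewart-type solver.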