
Article

Keywords:
large-scale unconstrained optimization; limited-memory variable metric method; BNS method; quasi-Newton method; convergence analysis; numerical experiments
Summary:
A modification of the limited-memory variable metric BNS method for large-scale unconstrained optimization of a differentiable function $f:{\cal R}^N\to{\cal R}$ is considered. It corrects the stored difference vectors, based on the idea of conjugate directions, so that the previous quasi-Newton conditions are satisfied more accurately. In comparison with [11], more previous iterations can be utilized here. For quadratic objective functions the improvement of convergence is optimal in a certain sense: all stored corrected difference vectors are mutually conjugate, and the quasi-Newton conditions are satisfied with all of them. The algorithm is globally convergent for convex, sufficiently smooth functions, and our numerical experiments indicate its efficiency.
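To make the correction idea concrete for the quadratic case, the following is a minimal sketch in Python; the helper `conjugate_correction` is hypothetical and illustrates only the principle, not the paper's actual BNS update formulas. For a quadratic objective with Hessian $A$ the difference vectors satisfy $y_i = A s_i$, so a Gram-Schmidt-like projection in the $A$-inner product makes the stored $s_i$ mutually conjugate while applying the same projection to $y_i$ preserves the quasi-Newton pairing.

```python
import numpy as np

def conjugate_correction(S, Y):
    # Correct stored difference pairs (s_i, y_i) so that the s_i become
    # mutually conjugate.  For a quadratic f with Hessian A, y_i = A s_i,
    # and applying the same projection to y_i keeps y_i = A s_i, i.e. the
    # quasi-Newton conditions hold for every corrected pair.
    S_c, Y_c = [], []
    for s, y in zip(S, Y):
        s_c, y_c = s.copy(), y.copy()
        for s_j, y_j in zip(S_c, Y_c):
            coef = (y_j @ s_c) / (y_j @ s_j)  # = s_j^T A s_c / s_j^T A s_j
            s_c = s_c - coef * s_j            # remove the conjugate component
            y_c = y_c - coef * y_j            # same correction applied to y
        S_c.append(s_c)
        Y_c.append(y_c)
    return S_c, Y_c

# Check on f(x) = x^T A x / 2, where y_i = A s_i holds exactly.
A = np.array([[4.0, 1.0],
              [1.0, 3.0]])
S = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
Y = [A @ s for s in S]
S_c, Y_c = conjugate_correction(S, Y)
assert abs(Y_c[0] @ S_c[1]) < 1e-12          # conjugacy: s_0^T A s_1 = 0
assert np.allclose(A @ S_c[1], Y_c[1])       # quasi-Newton pairing preserved
```

For general nonquadratic functions $y_i = A s_i$ holds only approximately, which is why the correction scheme studied in the article is more involved than this sketch.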