A New Stochastic Limited Memory BFGS Algorithm

Mojgan Momeni, Mohammad Reza Peyghami, Davoud Ataee Tarzanagh


In this paper, a new limited memory BFGS (L-BFGS) method is proposed for solving stochastic optimization problems. Since the cost of storing and manipulating the Hessian approximation $H_k$ is prohibitive in the large-scale setting, L-BFGS algorithms keep only the most recent correction pairs. Moreover, in the stochastic regime, noise in both the gradient vector and the Hessian approximation means that the second-order model is not an accurate estimate of the function. To overcome this problem, our L-BFGS uses memory more effectively by storing the correction pairs that least violate the secant equation. Under standard assumptions, convergence of the new algorithm is established for strongly convex functions. Numerical results on problems arising in machine learning show that the new method is competitive and effective in practice.
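To make the memory-selection idea concrete, the following is a minimal sketch of an L-BFGS two-loop recursion whose memory keeps the correction pairs with the smallest secant residual. This is an illustrative assumption, not the paper's actual rule: here the residual of a pair $(s_i, y_i)$ is scored against the usual scaled-identity initial matrix $H_k^0 = \gamma_k I$ as $\|\gamma_k y_i - s_i\|$, and the function names (`two_loop_direction`, `keep_least_violating`) are hypothetical.

```python
import numpy as np

def two_loop_direction(grad, pairs):
    """Standard L-BFGS two-loop recursion: returns the search direction -H_k @ grad.

    `pairs` is a list of (s, y) correction pairs, oldest first, with s @ y > 0.
    """
    q = grad.copy()
    alphas = []
    for s, y in reversed(pairs):           # newest to oldest
        rho = 1.0 / (y @ s)
        a = rho * (s @ q)
        alphas.append(a)
        q -= a * y
    if pairs:
        s, y = pairs[-1]
        gamma = (s @ y) / (y @ y)          # usual initial scaling H_k^0 = gamma * I
    else:
        gamma = 1.0
    r = gamma * q
    for (s, y), a in zip(pairs, reversed(alphas)):  # oldest to newest
        rho = 1.0 / (y @ s)
        beta = rho * (y @ r)
        r += (a - beta) * s
    return -r

def keep_least_violating(pool, m):
    """Hypothetical selection rule (assumption, not the paper's criterion):
    rank stored pairs by the secant residual ||gamma * y - s|| under the
    scaled-identity model and keep the m best, preserving their order."""
    if len(pool) <= m:
        return list(pool)
    s_new, y_new = pool[-1]
    gamma = (s_new @ y_new) / (y_new @ y_new)
    scored = sorted(pool, key=lambda p: np.linalg.norm(gamma * p[1] - p[0]))
    kept = scored[:m]
    return [p for p in pool if any(p is q for q in kept)]
```

In a stochastic setting the gradients below would be mini-batch estimates; a deterministic quadratic is used here only to keep the sketch self-contained.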


Keywords: Limited memory BFGS (L-BFGS); Stochastic optimization; Secant equation




This work is licensed under a Creative Commons Attribution 3.0 License.